RLAMA

Local assistant for document question answering using RAG.

Document QA · RAG · Knowledge management

Introduction

RLAMA is a local assistant for document question answering built on Retrieval-Augmented Generation (RAG). It connects to local Ollama models to index and query documents efficiently, so users can create, manage, and interact with document knowledge bases entirely on their own machines.

Key Features

Document indexing for intelligent retrieval

Multi-format support (text, code, PDF, DOCX)

Interactive query sessions

Local processing with privacy

Frequently Asked Questions

What is RLAMA?

RLAMA is a local document question-answering assistant. It builds RAG indexes over your files using local Ollama models, so you can create, manage, and query document knowledge bases without any data leaving your machine.

How to use RLAMA?

To use RLAMA, first index your document folder using a command like 'rlama rag [model] [rag-name] [folder-path]'. Then, start an interactive session with 'rlama run [rag-name]' to query your documents.
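Assuming a local Ollama model such as llama3 is available, the two steps above might look like this (the RAG name and folder path are illustrative):

```shell
# Index the ./docs folder into a RAG named "project-docs",
# using the local "llama3" Ollama model
rlama rag llama3 project-docs ./docs

# Start an interactive session to ask questions about those documents
rlama run project-docs
```

Once the session starts, you can type questions in plain language and RLAMA answers from the indexed documents.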

What formats of documents does RLAMA support?

RLAMA supports various formats including .txt, .md, .pdf, .docx, and more.

Is my data secure when using RLAMA?

Yes, RLAMA processes everything locally on your machine, ensuring that no data leaves your computer.

Use Cases

  • Query project documentation and manuals
  • Study research papers and textbooks
  • Create secure knowledge bases for sensitive documents

How to Use