Chat with your documents locally.

localrag

localrag is a Python package that lets you "chat" with your documents using a local Retrieval-Augmented Generation (RAG) pipeline, with no external Large Language Model (LLM) provider required.

It enables quick, easy, local interaction with text data: relevant passages are retrieved from your documents and used to generate answers grounded in their content.
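To make the retrieval half of RAG concrete, here is a minimal, self-contained sketch that scores document chunks by word overlap with the question and keeps the best match. This is illustrative only: localrag's actual retrieval is not shown here, and real RAG systems typically score with embeddings rather than word overlap.

```python
# Illustrative retrieval step of a RAG pipeline: rank chunks by how many
# words they share with the question, then return the top matches.
def retrieve(question: str, chunks: list[str], top_k: int = 1) -> list[str]:
    q_words = set(question.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

chunks = ["I have a dog", "The weather is sunny", "My car is red"]
print(retrieve("What type of pet do I have?", chunks))
# -> ['I have a dog']
```

In a full pipeline, the retrieved chunks would then be passed to the local LLM as context for generating the answer.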

Features

  • Local Processing: Runs entirely on your local machine - no need to send data externally.
  • Customizable: Easy to set up with default models or specify your own.
  • Versatile: Suitable for a variety of applications, from automated Q&A systems to data mining. Add files, folders, or websites to the index.

Prerequisites

Before you install and start using localrag, make sure you meet the following requirements:

Ollama for Local Inference

localrag uses Ollama for local inference, which is particularly convenient for macOS users. Ollama makes it easy to serve models and run inference locally. Install Ollama and pull a model before using localrag.
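A typical Ollama setup looks like the following. These are standard Ollama commands, but the install method varies by platform (on macOS, Ollama also ships as a downloadable app), and the model shown is only an example; localrag's default model may differ.

```shell
# Install Ollama (Linux install script; macOS users can download the app
# from ollama.com instead):
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model to serve locally (llama2 shown as an example):
ollama pull llama2

# Ollama listens on localhost:11434 by default; start the server if it
# is not already running:
ollama serve
```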

Installation

To install localrag, simply use pip:

pip install localrag

Quick Start

Here's a quick example of how you can use localrag to chat with your documents.

Suppose docs/test.txt contains the line:

I have a dog

Then you can index the docs folder and ask questions about its contents:

import localrag

# Optional arguments to init():
#   device="mps" or device="cuda:0" to choose the inference device
#   index_location="my_index_loc" to choose where the index is stored
#   system_prompt="..." to set a custom system prompt
my_local_rag = localrag.init()
# Add documents to the index
my_local_rag.add_to_index("./docs")
# Chat with your documents
response = my_local_rag.chat("What type of pet do I have?")
print(response.answer)
print(response.context)
# Based on the context you provided, I can determine that you have a dog. Therefore, the type of pet you have is "dog."
# [Document(page_content='I have a dog', metadata={'source': 'docs/test.txt'})]

License

This library is licensed under the Apache 2.0 License. See the LICENSE file.

