A collection of modules used for language processing and modeling
ABOUT
verba is a framework for working with LLMs and performing NLP tasks. It is designed to offer an alternative to the overly abstracted methods found in commercialized packages such as langchain.
----------------------------
LICENSE
GPL-3 Summary:
You may copy, distribute, and modify the software as long as you track changes/dates in source files. Any modifications to GPL-licensed code, or software that includes (via compiler) GPL-licensed code, must also be made available under the GPL along with build and install instructions. In other words, any derivative work of this software must be released under the same GPL license, meaning it must be exactly as free and open-source as the original.
----------------------------
INSTALL
verba is installed using pip:
pip install verba
NOTE: To use the LLM functionality, you need Ollama and the Ollama Python package installed on your machine.
SEE:
https://ollama.com/download
https://github.com/ollama/ollama-python
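If you want to confirm the Ollama setup before using verba's LLM features, a quick check like the one below works (a minimal sketch, assuming the Ollama server is running locally; the model name is only an example):
# Illustrative only: verify the Ollama server and Python client are working
import ollama

ollama.pull("llama3")   # download the model used in the examples below
print(ollama.list())    # the installed models; "llama3" should appear here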
----------------------------
Ragby: Retrieval-Augmented Generation (RAG) By Yourself
A collection of RAG-related methods
-----------
STEP 1) Import & Initialize:
from verba.OllamaHelper import Ragby
ragby = Ragby(chat_model="llama3", embedding_model="llama3")
STEP 2) Create Chunks:
- First place your input data file (.txt or .pdf) in a directory named data
- Run:
# for .txt files:
chunks_obj, chunks_path = ragby.make_chunks("my-txt-file.txt")
# for .pdf files:
chunks_obj, chunks_path = ragby.make_chunks_pdf("my-pdf-file.pdf")
- A directory named chunks will be created and the chunked text saved there; the call returns the chunk object and the path to the saved chunks (see the sketch below for what chunking involves)
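For intuition, chunking simply means splitting the input text into pieces small enough to embed and retrieve individually. The sketch below shows the general idea; the chunk_text helper and the 500-character limit are invented for illustration and are not verba's actual implementation:
# Illustrative only: split a text file into roughly 500-character chunks
def chunk_text(path, max_chars=500):
    with open(path, encoding="utf-8") as f:
        text = f.read()
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    chunks, current = [], ""
    for paragraph in paragraphs:
        if current and len(current) + len(paragraph) > max_chars:
            chunks.append(current)
            current = ""
        current = (current + "\n\n" + paragraph).strip()
    if current:
        chunks.append(current)
    return chunks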
STEP 3) Create Embeddings:
- Run:
embeddings_path = ragby.make_embeddings(chunks_obj, "my-txt-file.txt")
- A directory named embeddings will be created and the embeddings saved there; the call returns the path to the saved embeddings (see the sketch below)
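Behind a call like make_embeddings, each chunk is typically turned into a vector by the embedding model. A minimal sketch using the Ollama Python client directly (the chunk strings are placeholders, and this is an illustration rather than verba's code):
# Illustrative only: embed each chunk with the Ollama Python client
import ollama

chunks = ["first chunk of text...", "second chunk of text..."]  # placeholder chunks
embeddings = [
    ollama.embeddings(model="llama3", prompt=chunk)["embedding"]
    for chunk in chunks
]
print(len(embeddings), "embeddings of length", len(embeddings[0]))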
STEP 4) Create a System Prompt that is relevant to your output:
SYSTEM_PROMPT = "You are a helpful assistant who answers questions using only the provided CONTEXT. Be as concise as possible. If you are unsure, just say 'I don't know'.\n\nCONTEXT:\n"
STEP 5) Run:
CHAT = ragby.chat(
    user_prompt="What is the summary of this article?",
    system_prompt=SYSTEM_PROMPT,
    text_chunks_path=chunks_path,
    text_embeddings_path=embeddings_path,
)
print(CHAT)
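For context, a retrieval-augmented chat call like the one above generally embeds the question, ranks the stored chunks by similarity, and sends the best matches to the chat model together with the system prompt. A rough sketch of that flow, reusing the placeholder chunks and embeddings from the sketches above (illustrative only, not verba's exact implementation):
# Illustrative only: retrieve the most relevant chunks, then ask the chat model
import math
import ollama

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

question = "What is the summary of this article?"
q_emb = ollama.embeddings(model="llama3", prompt=question)["embedding"]

# rank stored chunks by similarity to the question and keep the top 3
ranked = sorted(zip(chunks, embeddings), key=lambda pair: cosine(q_emb, pair[1]), reverse=True)
context = "\n\n".join(chunk for chunk, _ in ranked[:3])

response = ollama.chat(
    model="llama3",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT + context},
        {"role": "user", "content": question},
    ],
)
print(response["message"]["content"])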