Beyond LLM is an all-in-one toolkit to build, experiment with, evaluate, and observe RAG pipelines
Project description
BeyondLLM
Build - Rapid Experiment - Evaluate - Repeat
Beyond LLM offers an all-in-one toolkit for experimentation, evaluation, and deployment of Retrieval-Augmented Generation (RAG) systems, simplifying the process with automated integration, customizable evaluation metrics, and support for various Large Language Models (LLMs) tailored to specific needs, ultimately aiming to reduce LLM hallucination risks and enhance reliability.
👉 Join our Discord community!

Try out a quick demo on Google Colab:
Quick install
- Beyond LLM is currently a private repository, so clone it using a GitHub access token:
git clone https://<UPDATE-WITH-YOUR-TOKEN>@github.com/aiplanethub/beyondllm.git
- Go to the project directory:
cd beyondllm
- Install the package:
pip install .
OR,
pip install -e .
When using the -e flag, the package is installed in editable mode: if you make changes to the source code, you do not need to reinstall the package for the changes to take effect.
Install on Google Colab
!git clone https://<UPDATE-WITH-YOUR-TOKEN>@github.com/aiplanethub/beyondllm.git
%cd /content/beyondllm/
pip install .
Quickstart Guide- Chat with YouTube Video
In this quickstart guide, we'll demonstrate how to create a "Chat with YouTube video" RAG application using Beyond LLM in fewer than 8 lines of code. Those 8 lines cover:
- Getting custom data source
- Retrieving documents
- Generating LLM responses
- Evaluating embeddings
- Evaluating LLM responses
Approach-1: Using Default LLM and Embeddings
Build a customised RAG pipeline in fewer than 5 lines of code using Beyond LLM.
from beyondllm import source,retrieve,generator
import os
os.environ['GOOGLE_API_KEY'] = "Your Google API Key:"
data = source.fit("https://www.youtube.com/watch?v=oJJyTztI_6g",dtype="youtube",chunk_size=512,chunk_overlap=50)
retriever = retrieve.auto_retriever(data,type="normal",top_k=3)
pipeline = generator.Generate(question="what tool is video mentioning about?",retriever=retriever)
print(pipeline.call())
Approach-2: With Custom LLM and Embeddings
Beyond LLM supports various embeddings and LLMs, two key components of Retrieval-Augmented Generation.
from beyondllm import source,retrieve,embeddings,llms,generator
import os
from getpass import getpass
os.environ['OPENAI_API_KEY'] = getpass("Your OpenAI API Key:")
data = source.fit("https://www.youtube.com/watch?v=oJJyTztI_6g",dtype="youtube",chunk_size=1024,chunk_overlap=0)
embed_model = embeddings.OpenAIEmbeddings()
retriever = retrieve.auto_retriever(data,embed_model,type="normal",top_k=4)
llm = llms.ChatOpenAIModel()
pipeline = generator.Generate(question="what tool is video mentioning about?",retriever=retriever,llm=llm)
print(pipeline.call()) #AI response
print(retriever.evaluate(llm=llm)) #evaluate embeddings
print(pipeline.get_rag_triad_evals()) #evaluate LLM response
Output
The tool mentioned in the context is called Jupiter, which is an AI Guru designed to simplify the learning of complex data science topics. Users can access Jupiter by logging into AI Planet, accessing any course for free, and then requesting explanations of topics from Jupiter in various styles, such as in the form of a movie plot. Jupiter aims to make AI education more accessible and interactive for everyone.
Hit_rate:1.0
MRR:1.0
Context relevancy Score: 8.0
Answer relevancy Score: 7.0
Groundness score: 7.666666666666667
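Hit rate and MRR (Mean Reciprocal Rank) are standard retrieval metrics. As a rough illustration of what the numbers above mean (this is not Beyond LLM's internal implementation, just a minimal sketch):

```python
def hit_rate(ranked_ids, relevant_id):
    # 1.0 if the relevant document appears anywhere in the retrieved list
    return 1.0 if relevant_id in ranked_ids else 0.0

def mrr(ranked_ids, relevant_id):
    # reciprocal of the 1-based rank of the first relevant document, else 0.0
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id == relevant_id:
            return 1.0 / rank
    return 0.0

print(hit_rate(["a", "b", "c"], "a"))  # 1.0
print(mrr(["a", "b", "c"], "a"))       # 1.0
```

A hit rate and MRR of 1.0, as in the output above, mean the relevant chunk was retrieved and ranked first for every evaluation query.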
Get in Touch
You can schedule a 1:1 meeting with our team to get started with GenAI Stack, OpenAGI, AI Planet Open Source LLMs (Buddhi, effi, and Panda Coder), and Beyond LLM. Schedule the call here: https://calendly.com/jaintarun
Contribution guidelines
Beyond LLM thrives in the rapidly evolving landscape of open-source projects. We wholeheartedly welcome contributions in various capacities, be it through innovative features, enhanced infrastructure, or refined documentation.
Acknowledgements
…and the entire open-source community.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
Hashes for beyondllm-0.0.1-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 3fe51140bcc5a59f536135bf3f87729a9d4da55c294a7f9a9dbefcc8a6bd7af5
MD5 | 1ab87bd673e4290a4d803e5545d57d9d
BLAKE2b-256 | ebb5ef084e7c306f71c45dbb04a6afe829c366779e8a4fffa01ea8908aa59f38