Create LLM-powered webapps with ease
Project description
Embedia
Make LLM-powered webapps with ease
Discord · Docs · Blog · Twitter
Which webapps can be created using Embedia?
- Chatbots (like ChatGPT) with features like:
  - permanent memory
  - access to web search
  - correlations between multiple chats
  - a specific personality (or a combination of personalities)
- Natural language search (powered by Retrieval Augmented Generation) over:
  - user-uploaded files
  - entire websites
  - custom large datasets
- AI Agents that can:
  - work with you to solve complex problems
  - autonomously run predefined code with custom parameters based on conversation context
- Entire SaaS products that use LLMs to solve a specific problem
Why choose Embedia?
- Developer friendly: easy-to-follow documentation, IntelliSense enabled
- Pre-defined common AI Agents and Tools
- LLM agnostic: connect any LLM you want (GPT-4, Bard, Llama, custom-trained, etc.)
- Vector DB agnostic: connect any vector DB you want (Weaviate, Pinecone, Chroma, etc.)
- Graph DB agnostic: connect any graph DB you want (Neo4j, Nebula, etc.)
- Pub-sub based event system to build highly customizable workflows
- Async: built from the ground up to be async
- Lightweight: Has very few dependencies and a tiny package size
- Small dev team with a clear focus on developer experience and scalability
How to use it?
- You'll need Python 3.8 or higher to use this library
- Install it with pip install embedia
Step 1: Define your Tokenizer class
import tiktoken
from embedia import Tokenizer
class OpenAITokenizer(Tokenizer):
    async def _tokenize(self, text):
        return tiktoken.encoding_for_model("gpt-3.5-turbo").encode(text)
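
The same pattern works for any tokenizer backend. As a sketch, here is a Hugging Face based tokenizer; it assumes the transformers package is installed, and bert-base-uncased is only an illustrative model name:

from transformers import AutoTokenizer
from embedia import Tokenizer

class HFTokenizer(Tokenizer):
    async def _tokenize(self, text):
        # Encode with a Hugging Face tokenizer instead of tiktoken
        return AutoTokenizer.from_pretrained("bert-base-uncased").encode(text)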
Step 2: Define your ChatLLM class
import openai
from embedia import ChatLLM
class OpenAIChatLLM(ChatLLM):
    def __init__(self):
        super().__init__(tokenizer=OpenAITokenizer())
        openai.api_key = "YOUR-API-KEY"

    async def _reply(self):
        completion = await openai.ChatCompletion.acreate(
            model="gpt-3.5-turbo",
            messages=[{'role': msg.role, 'content': msg.content}
                      for msg in self.chat_history],
        )
        return completion.choices[0].message.content
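
Because your subclass only needs to implement the _reply coroutine, swapping LLM providers is a matter of changing what that coroutine calls. Below is a minimal sketch that points the same openai client at an OpenAI-compatible server; the localhost URL and the local-model name are placeholders, not part of Embedia:

import openai
from embedia import ChatLLM

class LocalChatLLM(ChatLLM):
    def __init__(self):
        super().__init__(tokenizer=OpenAITokenizer())
        openai.api_key = "not-needed"
        # Placeholder endpoint that exposes the OpenAI chat completions API
        openai.api_base = "http://localhost:8000/v1"

    async def _reply(self):
        completion = await openai.ChatCompletion.acreate(
            model="local-model",
            messages=[{'role': msg.role, 'content': msg.content}
                      for msg in self.chat_history],
        )
        return completion.choices[0].message.content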
Step 3: Run your AI Agent
import asyncio
from embedia import Persona
from embedia.agents import ToolUserAgent
from embedia.tools import PythonInterpreterTool
python_coder = OpenAIChatLLM()
asyncio.run(python_coder.set_system_prompt(
    Persona.CodingLanguageExpert.format(language='Python')
))
code = asyncio.run(python_coder(
    'Count number of lines of python code in the current directory'
))
tool_user = ToolUserAgent(chatllm=OpenAIChatLLM(), tools=[PythonInterpreterTool()])
asyncio.run(tool_user(code))
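
If you're already inside an async context (for example a web framework handler), you can await the same calls directly instead of wrapping each one in asyncio.run. A sketch of the equivalent single-entry-point version of the example above:

import asyncio
from embedia import Persona
from embedia.agents import ToolUserAgent
from embedia.tools import PythonInterpreterTool

async def main():
    python_coder = OpenAIChatLLM()
    await python_coder.set_system_prompt(
        Persona.CodingLanguageExpert.format(language='Python')
    )
    code = await python_coder(
        'Count number of lines of python code in the current directory'
    )
    tool_user = ToolUserAgent(chatllm=OpenAIChatLLM(),
                              tools=[PythonInterpreterTool()])
    await tool_user(code)

asyncio.run(main())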
Quick glance over the library internals
The core classes of Embedia are:
- Tokenizer: A class that converts text into tokens
- LLM: A class that interfaces with a next-token-generation type large language model (eg: text-davinci-003)
- ChatLLM: A class that interfaces with a chat type large language model (eg: gpt-3.5-turbo)
- Tool: A class that can convert any Python function into a tool that can be used by the Agent
- EmbeddingModel: A class that interfaces with the embedding model (eg: text-embedding-ada-002)
- VectorDB: A class that interfaces with a vector database (eg: Weaviate)
Pre-defined Tools include:
- PythonInterpreterTool: A tool that can run Python code in the Python interpreter
- TerminalTool: A tool that can run shell commands in the terminal
- 10+ file operation tools: For reading, writing, copying, moving, and deleting files / folders
Pre-defined Agents include:
- ToolUserAgent: An LLM-powered System-1 thinker that can run tools in a loop by reading their docstrings (see the sketch below)
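
For instance, the same setup from Step 3 can be pointed at the TerminalTool instead of the Python interpreter. A sketch, reusing the OpenAIChatLLM class defined above; the prompt string is only an example:

import asyncio
from embedia.agents import ToolUserAgent
from embedia.tools import TerminalTool

shell_agent = ToolUserAgent(chatllm=OpenAIChatLLM(), tools=[TerminalTool()])
asyncio.run(shell_agent('List the five largest files in the current directory'))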
Helpers include:
- Pub-sub based event system for building highly customizable workflows
- Persona: An enum class containing pre-defined system prompts
- TextDoc: A class used for dealing with text documents
Learn more about them on our documentation page
How to contribute to the codebase?
This library is under active and rapid development. We'd love your contributions to make it better. To get started, you can check out contributing.md
Become a sponsor
Recurring revenue sponsors will get benefits like:
- Sponsored Screencasts with code
- Early access to Embedia's SaaS products
- Visibility on our website and social media
Partner with us
We'd love to partner with companies and libraries in the AI and web-dev ecosystem. If you'd like to get in touch, we're always active on our Discord server.
License
Copyright - Sudhanshu Passi, 2023, under the terms of the Apache 2.0 license
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file embedia-0.0.2.tar.gz.
File metadata
- Download URL: embedia-0.0.2.tar.gz
- Upload date:
- Size: 35.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 96870e00afaf0b846c97f1d659be32b814dea0d394f18eddcf0928a7af3ab50b
MD5 | aafad8216dee68ae133e08120b95a197
BLAKE2b-256 | dd9eed9b0229c4329fb1b7f64f0ababd63bd501576ea161374929d7d1b418f60
File details
Details for the file embedia-0.0.2-py3-none-any.whl.
File metadata
- Download URL: embedia-0.0.2-py3-none-any.whl
- Upload date:
- Size: 37.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9933eef834c95bed34a732ae32dbbebdd1d7037c5fb16c75ab28263722acebf5
MD5 | a3913e7950f70d999fcf623c40902124
BLAKE2b-256 | 462a3aac783f4206029698223d9fbcbf4f0bc7fe66f58d3a20b1f8eb725aab3e