A toolkit for quickly implementing LLM-powered functionality.
Project description
llm-axe 🪓
llm-axe is a handy little axe for developing LLM-powered applications.
It lets you quickly implement complex interactions with local LLMs, such as function callers, online agents, premade generic agents, and more.
Installation
pip install llm-axe
Example Snippets
- Online Chat Demo: a demo chat app showcasing an LLM with internet access
- Function Calling
A function-calling LLM can be created with just 3 lines of code:
No need for premade schemas, templates, special prompts, or specialized functions.
from llm_axe.models import OllamaChat
from llm_axe.agents import FunctionCaller  # import paths assumed

# get_time, get_date, get_location, add, multiply are ordinary
# Python functions defined by you.
prompt = "I have 500 coins, I just got 200 more. How many do I have?"
llm = OllamaChat(model="llama3:instruct")
fc = FunctionCaller(llm, [get_time, get_date, get_location, add, multiply])
result = fc.get_function(prompt)
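Since `FunctionCaller` takes plain Python functions with no hand-written schema, it presumably introspects each function's signature and docstring to describe it to the LLM. A minimal sketch of that idea using only the standard library (the `describe` helper is hypothetical, not part of llm-axe):

```python
import inspect

def add(a: int, b: int) -> int:
    """Adds two numbers."""
    return a + b

def describe(func):
    # Build a schema-like description from the function's
    # signature and docstring -- no hand-written schema needed.
    sig = inspect.signature(func)
    return {
        "name": func.__name__,
        "description": inspect.getdoc(func),
        "parameters": list(sig.parameters),
    }

# The LLM would receive descriptions like this for every candidate
# function and reply with the name of the one to call plus its arguments.
print(describe(add))
```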
- Online Agent
prompt = "Tell me a bit about this website: https://toscrape.com/?"
from llm_axe.models import OllamaChat
from llm_axe.agents import OnlineAgent  # import paths assumed

llm = OllamaChat(model="llama3:instruct")
searcher = OnlineAgent(llm)
resp = searcher.search(prompt)
#output: Based on information from the internet, it appears that https://toscrape.com/ is a website dedicated to web scraping.
# It provides a sandbox environment for beginners and developers to learn and validate their web scraping technologies...
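Before an online agent can answer, it has to decide which pages to fetch; one plausible first step is pulling URLs out of the free-form prompt. A sketch of that step (the regex and `extract_urls` helper are illustrative, not llm-axe internals):

```python
import re

def extract_urls(prompt: str) -> list[str]:
    # Pull http(s) URLs out of a free-form prompt so the agent
    # knows which pages to fetch before answering.
    return re.findall(r"https?://[^\s\"'>]+", prompt)

prompt = "Tell me a bit about this website: https://toscrape.com/?"
print(extract_urls(prompt))  # ['https://toscrape.com/?']
```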
- PDF Reader
from llm_axe.models import OllamaChat
from llm_axe.agents import PdfReader  # import paths assumed

llm = OllamaChat(model="llama3:instruct")
files = ["../FileOne.pdf", "../FileTwo.pdf"]
agent = PdfReader(llm)
resp = agent.ask("Summarize these documents for me", files)
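Under the hood, a PDF agent has to combine the extracted text of every file with the user's question into a single prompt. A stdlib-only sketch of that assembly step (the `build_prompt` helper and the document texts are illustrative, not llm-axe internals):

```python
def build_prompt(question: str, docs: dict[str, str]) -> str:
    # Concatenate each document's extracted text under a header,
    # then append the user's question -- roughly what a PDF agent
    # must do before handing everything to the LLM.
    parts = [f"--- {name} ---\n{text}" for name, text in docs.items()]
    parts.append(f"Question: {question}")
    return "\n\n".join(parts)

docs = {"FileOne.pdf": "Report text...", "FileTwo.pdf": "Invoice text..."}
print(build_prompt("Summarize these documents for me", docs))
```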
- Data Extractor
from llm_axe.models import OllamaChat
from llm_axe.agents import DataExtractor  # import paths assumed
from llm_axe.helpers import read_pdf  # helper module name assumed

llm = OllamaChat(model="llama3:instruct")
info = read_pdf("../Example.pdf")
de = DataExtractor(llm, reply_as_json=True)
resp = de.ask(info, ["name", "email", "phone", "address"])
#output: {'Name': 'Frodo Baggins', 'Email': 'frodo@gmail.com', 'Phone': '555-555-5555', 'Address': 'Bag-End, Hobbiton, The Shire'}
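With `reply_as_json=True`, the reply arrives as JSON text that the caller will want to parse and sanity-check. A sketch of that consumer-side step (the `parse_reply` helper is hypothetical, not part of llm-axe; note the case-insensitive check, since the example output capitalizes the keys):

```python
import json

def parse_reply(reply: str, fields: list[str]) -> dict:
    # Parse the model's JSON reply and verify (case-insensitively)
    # that every requested field actually came back.
    data = json.loads(reply)
    have = {k.lower() for k in data}
    missing = [f for f in fields if f.lower() not in have]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return data

reply = '{"Name": "Frodo Baggins", "Email": "frodo@gmail.com"}'
print(parse_reply(reply, ["name", "email"]))
```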
How to set up llm-axe with your own LLM
Features
- Local LLM internet access with Online Agent
- PDF Document Reader Agent
- Premade utility agents for common tasks
- Compatible with any LLM, local or externally hosted
- Built-in support for Ollama
Important Notes
The results you get from the agents are highly dependent on the capability of your LLM. An inadequate LLM will not be able to provide usable results with llm-axe.
Testing during development was done with llama3 8B instruct (4-bit quantization).
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file llm_axe-1.1.8.tar.gz.
File metadata
- Download URL: llm_axe-1.1.8.tar.gz
- Upload date:
- Size: 17.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | eb785f546f40e9d6411e08f9895defe7421d66f20f898dcbee587dcd50fee8c1
MD5 | ef95e2e17274d10212a8589874a5be4e
BLAKE2b-256 | df8d86e1658d28f044bbf98aea024097558c9edc0d97b375bfe3776cefd7a20c
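The published digests can be used to verify a download before installing. A small stdlib sketch that computes a file's SHA256 for comparison against the table above:

```python
import hashlib

def sha256_of(path: str) -> str:
    # Stream the file in chunks so large downloads don't
    # need to fit in memory at once.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published SHA256 digest, e.g.:
# sha256_of("llm_axe-1.1.8.tar.gz") == "eb785f546f40e9d6411e08f98..."
```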
File details
Details for the file llm_axe-1.1.8-py3-none-any.whl.
File metadata
- Download URL: llm_axe-1.1.8-py3-none-any.whl
- Upload date:
- Size: 16.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 8405082b42eb0a360daea83125cf57183b24454d49fc0863f250542af2c7d629
MD5 | 2fb955e4ba70f7ca8bfb562fb7fcc0c4
BLAKE2b-256 | 35b9847402f923c39017082e5dd01d23c5cea7c1e67b21ce032776c2b82dee5e