
A toolkit for quickly implementing LLM-powered functionality.

Project description

llm-axe 🪓


llm-axe is a handy little axe for developing LLM-powered applications.

It lets you quickly implement complex interactions with local LLMs, such as function callers, online agents, premade generic agents, and more.

Installation

pip install llm-axe

Example Snippets

  • Function Caller
  A function-calling LLM can be created with just three lines of code:
  no need for premade schemas, templates, special prompts, or specialized functions.

# Imports assumed from llm-axe's public API; exact paths may vary by version.
from llm_axe import OllamaChat, FunctionCaller

prompt = "I have 500 coins, I just got 200 more. How many do I have?"

llm = OllamaChat(model="llama3:instruct")
# get_time, get_date, get_location, add, multiply are ordinary Python
# functions defined by the caller.
fc = FunctionCaller(llm, [get_time, get_date, get_location, add, multiply])
result = fc.get_function(prompt)
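The tool functions passed to the function caller are plain Python. A minimal sketch of two of them (the bodies are illustrative, not part of llm-axe):

```python
# Plain Python functions used as tools; llm-axe inspects their
# signatures and docstrings to decide which one the prompt calls for.
def add(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b

def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

print(add(500, 200))  # 700
```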
  • Online Agent
prompt = "Tell me a bit about this website: https://toscrape.com/?"
llm = OllamaChat(model="llama3:instruct")
searcher = OnlineAgent(llm)
resp = searcher.search(prompt)

#output: Based on information from the internet, it appears that https://toscrape.com/ is a website dedicated to web scraping.
# It provides a sandbox environment for beginners and developers to learn and validate their web scraping technologies...
  • PDF Reader
llm = OllamaChat(model="llama3:instruct")
files = ["../FileOne.pdf", "../FileTwo.pdf"]
agent = PdfReader(llm)
resp = agent.ask("Summarize these documents for me", files)
  • Data Extractor
llm = OllamaChat(model="llama3:instruct")
info = read_pdf("../Example.pdf")
de = DataExtractor(llm, reply_as_json=True)
resp = de.ask(info, ["name", "email", "phone", "address"])

#output: {'Name': 'Frodo Baggins', 'Email': 'frodo@gmail.com', 'Phone': '555-555-5555', 'Address': 'Bag-End, Hobbiton, The Shire'}
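A hypothetical post-processing step for the extractor's reply, checking that every requested field came back. It assumes, as the output above suggests, that the reply is a dict; if your version returns a JSON string, parse it with json.loads first:

```python
# Reply shaped like the example output above (illustrative values).
resp = {"Name": "Frodo Baggins", "Email": "frodo@gmail.com",
        "Phone": "555-555-5555", "Address": "Bag-End, Hobbiton, The Shire"}

requested = ["name", "email", "phone", "address"]
# Compare case-insensitively, since the reply capitalizes the keys.
found = {key.lower() for key in resp}
missing = [field for field in requested if field not in found]
print(missing)  # [] when every requested field was extracted
```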

See more complete examples

How to set up llm-axe with your own LLM
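Since llm-axe is model-agnostic, the general pattern is to wrap any backend in a small adapter object that the agents can call. The class name, method name, and signature below are assumptions for illustration only; check the llm-axe documentation for the exact interface your version expects:

```python
# Hypothetical adapter: wraps any chat backend behind a single method.
class MyCustomLlm:
    def __init__(self, backend):
        # backend: any callable mapping a list of chat messages to text,
        # e.g. a call to a hosted API or a local model.
        self.backend = backend

    def ask(self, prompts, format="", temperature=0.8):
        # prompts: [{"role": "user" | "system", "content": "..."}, ...]
        return self.backend(prompts)

# Usage with a stand-in backend (replace with a real model call):
llm = MyCustomLlm(lambda msgs: "echo: " + msgs[-1]["content"])
print(llm.ask([{"role": "user", "content": "hello"}]))  # echo: hello
```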

Features

  • Local LLM internet access with Online Agent
  • PDF Document Reader Agent
  • Premade utility agents for common tasks
  • Compatible with any LLM, local or externally hosted
  • Built-in support for Ollama

Important Notes

The results you get from the agents depend heavily on the capability of your LLM. An inadequate LLM will not be able to produce usable results with llm-axe.

Testing during development was done using llama3 8B instruct (4-bit quantization).

Project details


Download files

Download the file for your platform.

Source Distribution

llm_axe-1.1.9.tar.gz (18.0 kB)

Uploaded Source

Built Distribution


llm_axe-1.1.9-py3-none-any.whl (16.9 kB)

Uploaded Python 3

File details

Details for the file llm_axe-1.1.9.tar.gz.

File metadata

  • Download URL: llm_axe-1.1.9.tar.gz
  • Upload date:
  • Size: 18.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.6

File hashes

Hashes for llm_axe-1.1.9.tar.gz
Algorithm Hash digest
SHA256 8b3f55cb78e65a6f79256238a3680c39eff47e0830e6c0ff8c2c006e4aebabb6
MD5 bce55a47a0fbdb90909acb99bc7ed918
BLAKE2b-256 c8f2e8f430860279e9065ceb9a1125f240b50e4b03d9c952824b4db00b5146ad

See more details on using hashes here.

File details

Details for the file llm_axe-1.1.9-py3-none-any.whl.

File metadata

  • Download URL: llm_axe-1.1.9-py3-none-any.whl
  • Upload date:
  • Size: 16.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.6

File hashes

Hashes for llm_axe-1.1.9-py3-none-any.whl
Algorithm Hash digest
SHA256 7f5da796e75856de134caecfc8da2cd8af55300666e58caa3e6f37ccef09895e
MD5 b83d3fbbe3bb2cf7a55c2d5ad12ba900
BLAKE2b-256 46e7b9816cd124035b36c8499844c7d9e2eb967409428fbe6fe81ae62a943a1a

See more details on using hashes here.
