A Python package for building LLM agents based on open-source models from the Hugging Face Hub.
Project description
OS (Open Source) LLM Agents
A library for building LLM agents based on open-source models from the Hugging Face Hub.
Installation
From source
Run the following in the root directory of this repository:
pip install .
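Alternatively, assuming the 0.0.1 release listed below is published to PyPI under the distribution name os-llm-agents, it should also be installable directly:
pip install os-llm-agents  # assumes the package is available on PyPI under this name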
Example usage
Import the required packages:
from os_llm_agents.models import CustomLLM
from os_llm_agents.executors import AgentExecutor
import torch
from transformers import pipeline, BitsAndBytesConfig
Optional: initialize a quantization config (4-bit loading via bitsandbytes):
quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
)
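Not part of the original example, but bitsandbytes also supports 8-bit loading as a lighter alternative; a minimal sketch, assuming CustomLLM accepts any transformers BitsAndBytesConfig:
# Hypothetical alternative to the 4-bit config above: 8-bit quantization
quantization_config = BitsAndBytesConfig(load_in_8bit=True)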
Initialize the model:
llm = CustomLLM(
    model_name="meta-llama/Meta-Llama-3-8B-Instruct",
    quantization_config=quantization_config,
)
Define the tool:
def multiply(**kwargs) -> int:
    """Multiply two integers together."""
    n1, n2 = kwargs["n1"], kwargs["n2"]
    return n1 * n2

multiply_tool = {
    "name": "multiply",
    "description": "Multiply two numbers",
    "parameters": {
        "type": "object",
        "properties": {
            "n1": {
                "type": "int",
                "description": "Number one",
            },
            "n2": {
                "type": "int",
                "description": "Number two",
            },
        },
        "required": ["n1", "n2"],
    },
    "implementation": multiply,  # Attach the function implementation
}
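Because the implementation is a plain Python callable that takes keyword arguments matching the schema, it can be sanity-checked directly before handing it to the agent (illustrative only):
# Quick check that the schema's parameter names match the implementation
assert multiply_tool["implementation"](n1=12, n2=12) == 144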
Initialize the AgentExecutor:
executor = AgentExecutor(
    llm=llm,
    tools=[multiply_tool],
    system_prompt="You are a helpful assistant.",
)
Run the agent:
chat_history = None
result = executor.invoke("What can you do for me?")
chat_history = result["chat_history"]
print("Response: ", result["response"].content)
>>> Response: I'm a helpful assistant! I can help you with a variety of tasks. I have access to a function called "multiply" that allows me to multiply two numbers. I can also provide information and answer questions to the best of my knowledge. If you need help with something specific, feel free to ask!
result = executor.invoke("Multiply 12 by 12", chat_history)
chat_history = result["chat_history"]
print("Response: ", result["response"].content)
>>> Response: 144
print(len(chat_history))
>>> 5
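Additional tools can presumably be registered by passing more dicts in the tools list; a minimal sketch, assuming the same schema format shown above (the add function and add_tool dict are illustrative, not part of the package):
def add(**kwargs) -> int:
    """Add two integers together."""
    return kwargs["n1"] + kwargs["n2"]

# Illustrative second tool, following the same dict format as multiply_tool
add_tool = {
    "name": "add",
    "description": "Add two numbers",
    "parameters": {
        "type": "object",
        "properties": {
            "n1": {"type": "int", "description": "Number one"},
            "n2": {"type": "int", "description": "Number two"},
        },
        "required": ["n1", "n2"],
    },
    "implementation": add,
}

executor = AgentExecutor(
    llm=llm,
    tools=[multiply_tool, add_tool],
    system_prompt="You are a helpful assistant.",
)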
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
os-llm-agents-0.0.1.tar.gz (5.6 kB, see file details below)
Built Distribution
os_llm_agents-0.0.1-py3-none-any.whl (6.1 kB, see file details below)
File details
Details for the file os-llm-agents-0.0.1.tar.gz.
File metadata
- Download URL: os-llm-agents-0.0.1.tar.gz
- Upload date:
- Size: 5.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 40e524fac44171d5716b23bf883200ad5c8a5e310fa79c4e5b6f6155db8c4dbd
MD5 | 92e4490d270a69eb3f0d036fa4dc8419
BLAKE2b-256 | 7e54112551c7a67d15698dde369f6fdb096378a12eae4ff90003bbb6bbb6db6c
File details
Details for the file os_llm_agents-0.0.1-py3-none-any.whl.
File metadata
- Download URL: os_llm_agents-0.0.1-py3-none-any.whl
- Upload date:
- Size: 6.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 56c7edf8d639f8079699ef39d6addc4272c3e7a45373e95ab0679c50df4ebd24
MD5 | 2fd36b658f0e35cac796e8e5f762c429
BLAKE2b-256 | 9e72f3af9b928d3a5ed5837ca89155fa2c6db2b90765b77d445eedacbaadd28e