Create an agent that can handle a large number of tools with persistence support.
Project description
AgentAmi
AgentAmi is a flexible agentic framework built on LangGraph, designed to scale to large numbers of tools by intelligently selecting the most relevant ones for each user query. This significantly reduces token usage.
It supports:
- Dynamic tool selection via a built-in RAG-based selector, with the option to easily swap in your own tool_selector.
- A built-in pruner that limits context length and improves performance (no configuration required).
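To see why sending only the top-k relevant tool schemas shrinks the prompt, here is a rough back-of-envelope sketch. The token figures below are illustrative assumptions, not AgentAmi measurements:

```python
# Rough illustration: sending every tool schema vs. only the top-k relevant ones.
# All numbers are illustrative assumptions, not AgentAmi measurements.

TOKENS_PER_TOOL_SCHEMA = 120  # assumed average tokens per tool definition
NUM_TOOLS = 200               # a "large number of tools" scenario
TOP_K = 3                     # AgentAmi's default top_k

all_tools_cost = NUM_TOOLS * TOKENS_PER_TOOL_SCHEMA
top_k_cost = TOP_K * TOKENS_PER_TOOL_SCHEMA

print(f"All tools in prompt:  {all_tools_cost} tokens")              # 24000 tokens
print(f"Top-{TOP_K} tools only: {top_k_cost} tokens")                # 360 tokens
print(f"Reduction: {100 * (1 - top_k_cost / all_tools_cost):.1f}%")  # 98.5%
```

The per-turn saving compounds across a conversation, since tool schemas are resent on every model call.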
Quick start
Refer to the main.py file for a complete usage example.
pip install agentami
from agentami import AgentAmi
from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import InMemorySaver

# Replace each ... (ellipsis) with your own values, following the comments
tools = [...]  # List of LangChain-compatible tools
agent = AgentAmi(
    model=ChatOpenAI(model="gpt-4o"),
    tools=tools,                   # List of LangChain-compatible tools
    checkpointer=InMemorySaver(),  # Optional; no persistence if omitted.
    # Optional parameters:
    tool_selector=...,    # Custom function to select relevant tools. Defaults to the internal tool_selector.
    top_k=...,            # Number of top tools to use. Defaults to 3.
    context_size=...,     # Number of past user prompts to retain. Defaults to 7.
    disable_pruner=...,   # If True, disables pruning; may increase token usage. Defaults to False.
    prompt_template=...,  # Custom prompt template. Defaults to a generic bot template.
)
agent_ami = agent.graph  # Your regular LangGraph graph.
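The checkpointer is what gives the graph memory across turns. Here is a toy stand-in (plain Python, not AgentAmi's or LangGraph's actual implementation) showing the idea: reusing the same thread_id retrieves prior state, while a new thread_id starts fresh:

```python
# Toy sketch of checkpointer-style persistence. A real run would use
# langgraph's InMemorySaver via AgentAmi, not this dict-backed mock.

class MockCheckpointer:
    def __init__(self):
        self._threads = {}  # thread_id -> list of messages

    def load(self, thread_id):
        return self._threads.get(thread_id, [])

    def save(self, thread_id, messages):
        self._threads[thread_id] = messages

def invoke(checkpointer, thread_id, user_message):
    history = checkpointer.load(thread_id)   # prior turns for this thread
    history = history + [user_message]
    checkpointer.save(thread_id, history)
    return history

cp = MockCheckpointer()
invoke(cp, "thread-1", "hello")
state = invoke(cp, "thread-1", "what did I say?")
print(state)  # ['hello', 'what did I say?'] -- same thread keeps context

fresh = invoke(cp, "thread-2", "hi")
print(fresh)  # ['hi'] -- new thread starts empty
```

Without a checkpointer, every invocation behaves like "thread-2" above: no prior state is carried over.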
How to integrate your own tool selector?
Just write a function that accepts (query: str, top_k: int) as parameters and returns a List[str] of tool names.
from typing import List

# Function template:
def my_own_tool_selector(query: str, top_k: int) -> List[str]:
    # Your logic to select tools based on the query
    return ["tool1", "tool2", "tool3"]  # Return the top_k selected tool names
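For instance, a minimal custom selector could rank tools by keyword overlap between the query and each tool's description. The tool registry below is a made-up example, and AgentAmi's internal selector uses embeddings rather than this heuristic, but the function satisfies the required (query, top_k) -> List[str] contract:

```python
from typing import List

# Hypothetical tool registry, for illustration only. In practice you would
# build this from your own tools' names and descriptions.
TOOL_DESCRIPTIONS = {
    "get_weather": "fetch the current weather forecast for a city",
    "send_email": "send an email message to a recipient",
    "search_docs": "search internal documentation for a query",
    "convert_currency": "convert an amount between two currencies",
}

def keyword_tool_selector(query: str, top_k: int) -> List[str]:
    """Rank tools by how many query words appear in their description."""
    query_words = set(query.lower().split())
    scored = [
        (len(query_words & set(desc.lower().split())), name)
        for name, desc in TOOL_DESCRIPTIONS.items()
    ]
    scored.sort(key=lambda pair: (-pair[0], pair[1]))  # best score first, ties by name
    return [name for _, name in scored[:top_k]]

print(keyword_tool_selector("what is the weather forecast for Paris", 2))
# ['get_weather', 'search_docs']
```

Pass a function like this as the tool_selector argument to AgentAmi() to replace the built-in selector.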
Things you should be aware of:
- The first run will take a while, as it downloads dependencies (the models used by the internal tool_selector).
- Your first agent_ami.invoke() or agent_ami.astream() may take time if you have hundreds of tools, because each AgentAmi() object initialises a vector store and embeds the tool descriptions at runtime. Subsequent prompts will be fast.
Project details
File details
Details for the file agentami-0.1.8.tar.gz.
File metadata
- Download URL: agentami-0.1.8.tar.gz
- Upload date:
- Size: 8.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 21ace6c96bbe5dac2f5552ed076c65c601f88f54089f72f8440c62b5a7587283 |
| MD5 | b38318adc5426cd339deddcf8ee3055d |
| BLAKE2b-256 | 007a0cfbc309aaeffa9855b2778e3317152fd18c30c7260b34df74e12b4b2cda |
File details
Details for the file agentami-0.1.8-py3-none-any.whl.
File metadata
- Download URL: agentami-0.1.8-py3-none-any.whl
- Upload date:
- Size: 8.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | de5c3220629346846534bb6e860507b34e111069e8fff2dcdddea8f35460756a |
| MD5 | 13755bac72a8e544fd6868a857d1a9e9 |
| BLAKE2b-256 | 1983b5158d46b8063d1929950d20e9ac4bec95aa71aa1f7ec9d1ed713e96dca0 |