# langgraph-bigtool

Build LangGraph agents with large numbers of tools.
`langgraph-bigtool` is a Python library for creating LangGraph agents that can access large numbers of tools. It leverages LangGraph's long-term memory store to allow an agent to search for and retrieve relevant tools for a given problem.
## Features
- 🧰 Scalable access to tools: Equip agents with hundreds or thousands of tools.
- 📝 Storage of tool metadata: Control storage of tool descriptions, namespaces, and other information through LangGraph's built-in persistence layer. Includes support for in-memory and Postgres backends.
- 💡 Customization of tool retrieval: Optionally define custom functions for tool retrieval.
This library is built on top of LangGraph, a powerful framework for building agent applications, and comes with out-of-the-box support for streaming, short-term and long-term memory, and human-in-the-loop.
## Installation

```bash
pip install langgraph-bigtool
```
## Quickstart

We demonstrate `langgraph-bigtool` by equipping an agent with all functions from Python's built-in `math` library.

> [!NOTE]
> This includes about 50 tools. Some LLMs can handle this number of tools together in a single invocation without issue. This example is for demonstration purposes.

```bash
pip install langgraph-bigtool "langchain[openai]"

export OPENAI_API_KEY=<your_api_key>
```
```python
import math
import types
import uuid

from langchain.chat_models import init_chat_model
from langchain.embeddings import init_embeddings
from langgraph.store.memory import InMemoryStore

from langgraph_bigtool import create_agent
from langgraph_bigtool.utils import (
    convert_positional_only_function_to_tool
)

# Collect functions from `math` built-in
all_tools = []
for function_name in dir(math):
    function = getattr(math, function_name)
    if not isinstance(
        function, types.BuiltinFunctionType
    ):
        continue
    # This is an idiosyncrasy of the `math` library
    if tool := convert_positional_only_function_to_tool(
        function
    ):
        all_tools.append(tool)

# Create registry of tools. This is a dict mapping
# identifiers to tool instances.
tool_registry = {
    str(uuid.uuid4()): tool
    for tool in all_tools
}

# Index tool names and descriptions in the LangGraph
# Store. Here we use a simple in-memory store.
embeddings = init_embeddings("openai:text-embedding-3-small")

store = InMemoryStore(
    index={
        "embed": embeddings,
        "dims": 1536,
        "fields": ["description"],
    }
)
for tool_id, tool in tool_registry.items():
    store.put(
        ("tools",),
        tool_id,
        {
            "description": f"{tool.name}: {tool.description}",
        },
    )

# Initialize agent
llm = init_chat_model("openai:gpt-4o-mini")

builder = create_agent(llm, tool_registry)
agent = builder.compile(store=store)
```

Test it out:

```python
query = "Use available tools to calculate arc cosine of 0.5."

for step in agent.stream(
    {"messages": query},
    stream_mode="updates",
):
    for _, update in step.items():
        for message in update.get("messages", []):
            message.pretty_print()
```
```
================================== Ai Message ==================================
Tool Calls:
  retrieve_tools (call_nYZy6waIhivg94ZFhz3ju4K0)
 Call ID: call_nYZy6waIhivg94ZFhz3ju4K0
  Args:
    query: arc cosine calculation
================================= Tool Message =================================

Available tools: ['cos', 'acos']
================================== Ai Message ==================================
Tool Calls:
  acos (call_ynI4zBlJqXg4jfR21fVKDTTD)
 Call ID: call_ynI4zBlJqXg4jfR21fVKDTTD
  Args:
    x: 0.5
================================= Tool Message =================================
Name: acos

1.0471975511965976
================================== Ai Message ==================================

The arc cosine of 0.5 is approximately 1.0472 radians.
```
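As a rough sanity check on the "about 50 tools" figure in the note above, the same filter from the quickstart can be run on its own. This is a stdlib-only sketch; the exact count varies by Python version:

```python
import math
import types

# Count the built-in functions exposed by `math`, using the same
# filter as in the quickstart above.
count = sum(
    isinstance(getattr(math, name), types.BuiltinFunctionType)
    for name in dir(math)
)
print(count)  # roughly 50-60, depending on the Python version
```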
## Customizing tool retrieval

`langgraph-bigtool` equips an agent with a tool that is used to retrieve tools in the registry. You can customize retrieval by passing `retrieve_tools_function` and/or `retrieve_tools_coroutine` into `create_agent`. These functions are expected to return a list of tool IDs as output.
```python
from langgraph.prebuilt import InjectedStore
from langgraph.store.base import BaseStore
from typing_extensions import Annotated

def retrieve_tools(
    query: str,
    # Add custom arguments here...
    *,
    store: Annotated[BaseStore, InjectedStore],
) -> list[str]:
    """Retrieve a tool to use, given a search query."""
    results = store.search(("tools",), query=query, limit=2)
    tool_ids = [result.key for result in results]
    # Insert your custom logic here...
    return tool_ids

builder = create_agent(
    llm, tool_registry, retrieve_tools_function=retrieve_tools
)
agent = builder.compile(store=store)
```
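The retrieval contract is minimal: given the arguments of the retrieval tool call, return a list of registry IDs. Any ranking logic can stand in for the store's semantic search. Below is a dependency-free sketch using `difflib` string similarity over hypothetical tool descriptions (the IDs and descriptions are illustrative, not part of the library):

```python
from difflib import SequenceMatcher

# Hypothetical registry metadata: tool ID -> description
tool_descriptions = {
    "id_1": "acos: return the arc cosine of x",
    "id_2": "sqrt: return the square root of x",
    "id_3": "floor: round a number down",
}

def retrieve_tools(query: str, limit: int = 2) -> list[str]:
    """Rank tool IDs by crude string similarity to the query."""
    scored = sorted(
        tool_descriptions.items(),
        key=lambda kv: SequenceMatcher(None, query, kv[1]).ratio(),
        reverse=True,
    )
    return [tool_id for tool_id, _ in scored[:limit]]

print(retrieve_tools("arc cosine"))  # "id_1" ranks first
```

The same shape of function (query in, IDs out) can then be passed as `retrieve_tools_function`.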
### Retrieving tools without LangGraph Store

You can implement arbitrary logic for tool retrieval; it does not have to run semantic search against a query. Below, we return collections of tools corresponding to categories:
```python
from typing import Literal

tool_registry = {
    "id_1": get_balance,
    "id_2": get_history,
    "id_3": create_ticket,
}

def retrieve_tools(
    category: Literal["billing", "service"],
) -> list[str]:
    """Get tools for a category."""
    if category == "billing":
        return ["id_1", "id_2"]
    else:
        return ["id_3"]
```
> [!TIP]
> Because the argument schema is inferred from type hints, type-hinting the argument as a `Literal` will signal that the LLM should populate a categorical value.
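To see why the tip works, note that the `Literal` values are recoverable from the function's type hints, which is what schema inference relies on. A stdlib-only sketch of the introspection involved:

```python
from typing import Literal, get_args, get_type_hints

def retrieve_tools(
    category: Literal["billing", "service"],
) -> list[str]:
    """Get tools for a category."""
    if category == "billing":
        return ["id_1", "id_2"]
    return ["id_3"]

# The annotation exposes the allowed values, so a schema generator
# can present them to the LLM as an enum-style choice.
hints = get_type_hints(retrieve_tools)
print(get_args(hints["category"]))  # ('billing', 'service')
```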
## Related work

- Toolshed: Scale Tool-Equipped Agents with Advanced RAG-Tool Fusion and Tool Knowledge Bases - Lumer, E., Subbiah, V.K., Burke, J.A., Basavaraju, P.H. & Huber, A. (2024). arXiv:2410.14594.
- Graph RAG-Tool Fusion - Lumer, E., Basavaraju, P.H., Mason, M., Burke, J.A. & Subbiah, V.K. (2025). arXiv:2502.07223.
- Retrieval Models Aren't Tool-Savvy: Benchmarking Tool Retrieval for Large Language Models - Shi, Z., Wang, Y., Yan, L., Ren, P., Wang, S., Yin, D. & Ren, Z. arXiv:2503.01763.