PyAgenity
PyAgenity is a lightweight Python framework for building intelligent agents and orchestrating multi-agent workflows on top of the LiteLLM unified LLM interface. Designed for flexibility and scalability, it lets developers create agents that collaborate, communicate, and solve complex tasks together.
Features
- Unified Agent abstraction (no raw LiteLLM objects leaked)
- Structured responses with content, optional thinking, and usage
- Streaming support with incremental chunks
- Final-message hooks for persistence/logging
- LangGraph-inspired graph engine: nodes, conditional edges, pause/resume (human-in-the-loop)
- In-memory session state store (pluggable in the future)
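The core ideas of the graph engine (nodes that transform state, conditional edges that pick the next node) can be illustrated with a minimal, framework-agnostic sketch. None of the names below come from PyAgenity's actual API; this is only a conceptual model:

```python
# Minimal sketch of a node/conditional-edge graph loop.
# Illustrative only; NOT PyAgenity's API.

def greet(state):
    # A node: receives the state, mutates or replaces it, returns it.
    state["messages"].append("hello")
    return state

def route(state):
    # A conditional edge: inspects state and returns the next node's name.
    return "end" if state["messages"] else "greet"

nodes = {"greet": greet}

def run(state, entry="greet"):
    # The engine loop: execute the current node, then follow the edge.
    current = entry
    while current != "end":
        state = nodes[current](state)
        current = route(state)
    return state

result = run({"messages": []})
print(result["messages"])  # ['hello']
```

A real engine adds checkpointing, pause/resume, and tool dispatch around this same loop.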
Installation
With uv (recommended):
uv pip install pyagenity
Or with pip:
pip install pyagenity
Development vs. Library Usage
Library consumers:
- Only pyproject.toml is needed; it contains runtime dependencies and build metadata.
- Install with pip or uv as shown above.
Development contributors:
- Use pyproject.dev.toml for all development tools, linters, test runners, and optional extras.
- Install dev dependencies with:
# Option 1: pip (from requirements-dev.txt)
python -m venv .venv
source .venv/bin/activate
pip install -r requirements-dev.txt
# Option 2: pip (editable install with extras, if supported)
pip install -e .[dev]
# Option 3: uv (if you use uv)
uv pip install -r requirements-dev.txt
Note:
pyproject.dev.toml contains all dev/test/docs/mail extras and tool configs (ruff, isort, mypy, pytest, bandit, etc.). pyproject.toml is minimal and safe to use as a library dependency in other projects.
Set provider API keys (example for OpenAI):
export OPENAI_API_KEY=sk-... # required for gpt-* models
If you have a .env file, it will be auto-loaded (via python-dotenv).
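Whether exported in the shell or loaded from a .env file by python-dotenv, the key ends up in the process environment, so it can be checked at startup with the standard library alone:

```python
import os

# After `export OPENAI_API_KEY=sk-...` (or a .env loaded by python-dotenv),
# the key is visible like any other environment variable.
api_key = os.environ.get("OPENAI_API_KEY", "")
if not api_key:
    print("Warning: OPENAI_API_KEY is not set; gpt-* calls will fail.")
```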
See example/graph_demo.py for a runnable example.
Example: React Weather Agent
This repository includes a small example agent that demonstrates tool injection and a simple tool node for returning weather information. The example is in examples/react/react_weather_agent.py and uses an in-memory checkpointer. It's intended as a runnable demo showing how to register ToolNodes, conditional edges, and invoke the compiled graph.
Key points:
- Demonstrates an injectable tool signature that receives tool_call_id and state.
- Shows how to add a ToolNode and conditional edges to a StateGraph.
- Uses litellm.completion to call an LLM (set your provider keys as environment variables or use a .env file).
Excerpt (simplified) from examples/react/react_weather_agent.py:
from dotenv import load_dotenv
from litellm import completion
from pyagenity.checkpointer import InMemoryCheckpointer
from pyagenity.graph import StateGraph, ToolNode
from pyagenity.state.agent_state import AgentState
from pyagenity.utils import Message
from pyagenity.utils.constants import END
from pyagenity.utils.converter import convert_messages
from pyagenity.utils.injectable import InjectState, InjectToolCallID
from typing import Any
load_dotenv()
checkpointer = InMemoryCheckpointer()
def get_weather(location: str, tool_call_id: InjectToolCallID, state: InjectState) -> str:
    """Simple demo tool that prints injected params and returns a fake weather string."""
    if tool_call_id:
        print(f"Tool call ID: {tool_call_id}")
    if state and hasattr(state, "context"):
        print(f"Number of messages in context: {len(state.context)}")  # type: ignore
    return f"The weather in {location} is sunny."
# Build graph and compile
tool_node = ToolNode([get_weather])
def main_agent(state: AgentState, config: dict[str, Any], checkpointer=None, store=None):
    # Build messages and call the LLM. Use tools when appropriate.
    prompts = """
    You are a helpful assistant.
    Your task is to assist the user in finding information and answering questions.
    """
    messages = convert_messages(system_prompts=[{"role": "system", "content": prompts}], state=state)
    # If the last message is a tool result, return the final response without tools
    if state.context and state.context[-1].role == "tool" and state.context[-1].tool_call_id is not None:
        response = completion(model="your-model", messages=messages)
    else:
        response = completion(model="your-model", messages=messages, tools=tool_node.all_tools())
    return response
# Build graph, compile and invoke (see full file for details)
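The tool-result check inside main_agent is exactly the decision a conditional edge encodes: after a completed tool call, finish; otherwise, run the tool node. A framework-agnostic sketch of such a router (the Msg class is a stand-in, not PyAgenity's message type):

```python
from dataclasses import dataclass
from typing import Optional

# Stand-in message type for illustration; not PyAgenity's Message class.
@dataclass
class Msg:
    role: str
    tool_call_id: Optional[str] = None

def route_after_agent(context: list) -> str:
    # Conditional edge: if the last message is a completed tool result,
    # the agent can produce its final answer; otherwise run the tool node.
    last = context[-1] if context else None
    if last is not None and last.role == "tool" and last.tool_call_id is not None:
        return "END"
    return "TOOL"

print(route_after_agent([Msg("user"), Msg("tool", "call_1")]))  # END
print(route_after_agent([Msg("user")]))                         # TOOL
```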
How to run the example locally
- Install dependencies (recommended in a virtualenv):
pip install -r requirements.txt
# or if you use uv
uv pip install -r requirements.txt
- Set your LLM provider API key (for example OpenAI):
export OPENAI_API_KEY="sk-..."
# or create a .env with the key and the script will load it automatically
- Run the example script:
python examples/react/react_weather_agent.py
Notes
- The example uses litellm's completion function; set model to a provider/model available in your environment (for example gpt-4o-mini or another supported model string).
- InMemoryCheckpointer is for demo/testing only; replace it with a persistent checkpointer for production.
Human-in-the-Loop
A HumanInputNode causes execution to pause (WAITING_HUMAN) if human_input is absent. Provide input via resume(session_id, human_input=...).
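The pause/resume mechanics can be sketched generically. The session dict and function names below are illustrative, not PyAgenity's actual state store or API:

```python
# Illustrative pause/resume flow for human-in-the-loop; a sketch only.
sessions = {}

def step(session_id):
    session = sessions[session_id]
    if session.get("human_input") is None:
        session["status"] = "WAITING_HUMAN"  # pause until a human responds
    else:
        session["status"] = "COMPLETED"
    return session["status"]

def resume(session_id, human_input):
    # Supply the missing input, then continue execution.
    sessions[session_id]["human_input"] = human_input
    return step(session_id)

sessions["s1"] = {"human_input": None}
print(step("s1"))               # WAITING_HUMAN
print(resume("s1", "approve"))  # COMPLETED
```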
Final Hooks
Use GraphExecutor.add_final_hook(callable) to register hooks invoked when a session reaches COMPLETED or FAILED.
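A minimal sketch of the final-hook pattern; the Executor class here is hypothetical and does not reflect GraphExecutor's actual implementation:

```python
from typing import Callable

# Hypothetical hook registry illustrating the final-hook pattern.
class Executor:
    def __init__(self):
        self._final_hooks: list[Callable[[str, str], None]] = []

    def add_final_hook(self, hook: Callable[[str, str], None]) -> None:
        self._final_hooks.append(hook)

    def _finish(self, session_id: str, status: str) -> None:
        # Invoked once a session reaches COMPLETED or FAILED.
        for hook in self._final_hooks:
            hook(session_id, status)

seen = []
ex = Executor()
ex.add_final_hook(lambda sid, status: seen.append((sid, status)))
ex._finish("s1", "COMPLETED")
print(seen)  # [('s1', 'COMPLETED')]
```

Hooks like this are a natural place for persistence or logging of finished sessions.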
Roadmap
- Persistent state backend (Redis, SQL, etc.)
- Parallel / branching strategies and selection policies
- Tool invocation nodes & function calling wrappers
- Tracing / telemetry integration
License
MIT
Publishing to PyPI
- Build the package:
uv pip install build twine
python -m build
- Upload to PyPI:
uv pip install twine
twine upload dist/*
For test uploads, use TestPyPI:
twine upload --repository testpypi dist/*
File details
Details for the file pyagenity-0.1.1.tar.gz.
File metadata
- Download URL: pyagenity-0.1.1.tar.gz
- Upload date:
- Size: 39.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ffb5638e11f21ac54011ea1e9c4ffe796268bbe5099c7c9fbcfe97d12fe10953 |
| MD5 | ca3ce3a4e1396da9a36e8adc8c72c2f5 |
| BLAKE2b-256 | d783636d95393593512da33d22e098732e342614e0c56acb21e829f52d43d10d |
File details
Details for the file pyagenity-0.1.1-py3-none-any.whl.
File metadata
- Download URL: pyagenity-0.1.1-py3-none-any.whl
- Upload date:
- Size: 45.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 68766142ecb1a84fc624e1f88f77ef39765087556cc7b8581e468601529800e3 |
| MD5 | 66076a7f6abedae3548540d32c0b6800 |
| BLAKE2b-256 | 4555848a711b0b881a171c4e2a76a94a4a3f9ed2ec62620615eb4761a77f1a0e |