PyAgenity is a Python framework for building, orchestrating, and managing multi-agent systems. Designed for flexibility and scalability, PyAgenity enables developers to create intelligent agents that collaborate, communicate, and solve complex tasks together.

Project description

PyAgenity

PyAgenity is a lightweight Python framework for building intelligent agents and orchestrating multi-agent workflows on top of the LiteLLM unified LLM interface.

Features

  • Unified Agent abstraction (no raw LiteLLM objects leaked)
  • Structured responses with content, optional thinking, and usage
  • Streaming support with incremental chunks
  • Final message hooks for persistence/logging
  • LangGraph-inspired graph engine: nodes, conditional edges, pause/resume (human-in-the-loop)
  • In-memory session state store (pluggable in the future)

Installation

With uv (recommended):

uv pip install pyagenity

Or with pip:

pip install pyagenity

Development vs. Library Usage

Library consumers:

  • Only pyproject.toml is needed. It contains runtime dependencies and build metadata.
  • Install with pip or uv as shown above.

Development contributors:

  • Use pyproject.dev.toml for all development tools, linters, test runners, and optional extras.
  • Install dev dependencies with:
# Option 1: pip (from requirements-dev.txt)
python -m venv .venv
source .venv/bin/activate
pip install -r requirements-dev.txt

# Option 2: pip (editable install with extras, if supported)
pip install -e .[dev]

# Option 3: uv (if you use uv)
uv pip install -r requirements-dev.txt

Note:

  • pyproject.dev.toml contains all dev/test/docs/mail extras and tool configs (ruff, isort, mypy, pytest, bandit, etc.).
  • pyproject.toml is minimal and safe for use as a library dependency in other projects.

Set provider API keys (example for OpenAI):

export OPENAI_API_KEY=sk-...  # required for gpt-* models

If you have a .env file, it will be auto-loaded (via python-dotenv).
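
For example, a .env file in the project root containing the same key is picked up automatically at startup:

OPENAI_API_KEY=sk-...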


See example/graph_demo.py for a runnable example.

Example: React Weather Agent

This repository includes a small example agent that demonstrates tool injection and a simple tool node that returns weather information. The example is in examples/react/react_weather_agent.py and uses an in-memory checkpointer. It's intended as a runnable demo showing how to register a ToolNode, add conditional edges, and invoke the compiled graph.

Key points:

  • Demonstrates an injectable tool signature that receives tool_call_id and state.
  • Shows how to add ToolNode and conditional edges into a StateGraph.
  • Uses litellm.completion to call an LLM (set your provider keys as environment vars or use a .env).

Excerpt (simplified) from examples/react/react_weather_agent.py:

from typing import Any

from dotenv import load_dotenv
from litellm import completion

from pyagenity.checkpointer import InMemoryCheckpointer
from pyagenity.graph import StateGraph, ToolNode
from pyagenity.state.agent_state import AgentState
from pyagenity.utils import Message
from pyagenity.utils.constants import END
from pyagenity.utils.converter import convert_messages
from pyagenity.utils.injectable import InjectState, InjectToolCallID

load_dotenv()

checkpointer = InMemoryCheckpointer()

def get_weather(location: str, tool_call_id: InjectToolCallID, state: InjectState) -> str:
    """Simple demo tool that prints injected params and returns a fake weather string."""
    if tool_call_id:
        print(f"Tool call ID: {tool_call_id}")
    if state and hasattr(state, "context"):
        print(f"Number of messages in context: {len(state.context)}")  # type: ignore
    return f"The weather in {location} is sunny."

# Build graph and compile
tool_node = ToolNode([get_weather])

def main_agent(state: AgentState, config: dict[str, Any], checkpointer=None, store=None):
    # Build messages and call the LLM. Use tools when appropriate.
    prompts = """
        You are a helpful assistant.
        Your task is to assist the user in finding information and answering questions.
    """

    messages = convert_messages(system_prompts=[{"role": "system", "content": prompts}], state=state)

    # If last message is a tool result, return final response without tools
    if state.context and state.context[-1].role == "tool" and state.context[-1].tool_call_id is not None:
        response = completion(model="your-model", messages=messages)
    else:
        response = completion(model="your-model", messages=messages, tools=tool_node.all_tools())

    return response

# Build graph, compile and invoke (see full file for details)
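
To give a feel for that wiring step, here is a hedged sketch of how the pieces might be connected. Method names such as add_node, add_conditional_edges, add_edge, set_entry_point, compile, and invoke, as well as the route_after_agent helper and the invoke payload, are assumptions modeled on the LangGraph-inspired API described in Features; the authoritative wiring is in examples/react/react_weather_agent.py.

# Hedged sketch -- the graph-building method names below are assumptions; see the full example file.
graph = StateGraph()
graph.add_node("MAIN", main_agent)
graph.add_node("TOOL", tool_node)

def route_after_agent(state: AgentState) -> str:
    # Hypothetical router: go to the tool node when the last assistant
    # message requested a tool call, otherwise finish the run.
    last = state.context[-1] if state.context else None
    if last is not None and getattr(last, "tool_calls", None):
        return "TOOL"
    return END

graph.add_conditional_edges("MAIN", route_after_agent, {"TOOL": "TOOL", END: END})
graph.add_edge("TOOL", "MAIN")
graph.set_entry_point("MAIN")

app = graph.compile(checkpointer=checkpointer)

# The input and result shapes here are assumptions as well; adjust to the actual example.
result = app.invoke({"messages": [Message(role="user", content="What is the weather in Paris?")]})
print(result)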

How to run the example locally

  1. Install dependencies (recommended in a virtualenv):
pip install -r requirements.txt
# or if you use uv
uv pip install -r requirements.txt
  2. Set your LLM provider API key (for example OpenAI):
export OPENAI_API_KEY="sk-..."
# or create a .env with the key and the script will load it automatically
  3. Run the example script:
python examples/react/react_weather_agent.py

Notes

  • The example uses litellm's completion function — set model to a provider/model available in your environment (for example gpt-4o-mini or other supported model strings).
  • InMemoryCheckpointer is for demo/testing only. Replace with a persistent checkpointer for production.

Human-in-the-Loop

A HumanInputNode causes execution to pause (WAITING_HUMAN) if human_input is absent. Provide input via resume(session_id, human_input=...).
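
A minimal sketch of that flow, assuming a GraphExecutor-style runner: only resume(session_id, human_input=...) and the WAITING_HUMAN status come from the description above; the executor variable and the get_status call are hypothetical.

# Hedged sketch: executor setup and status lookup are assumptions.
session_id = "demo-session"

# The run pauses at the HumanInputNode because no human_input was provided.
if executor.get_status(session_id) == "WAITING_HUMAN":
    # Supply the missing input; execution continues from where it paused.
    executor.resume(session_id, human_input="Yes, go ahead with the booking.")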


Final Hooks

Use GraphExecutor.add_final_hook(callable) to register hooks invoked when a session reaches COMPLETED or FAILED.
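
A hedged sketch of registering such a hook follows; add_final_hook comes from the line above, while the hook's signature (assumed here to receive the finished session object) is an assumption.

# Hedged sketch: the hook's signature is an assumption.
def persist_final_state(session):
    # Invoked once the session reaches COMPLETED or FAILED; log or persist it here.
    print(f"Session finished with status: {getattr(session, 'status', 'unknown')}")

# executor is your GraphExecutor instance (construction details omitted).
executor.add_final_hook(persist_final_state)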


Roadmap

  • Persistent state backend (Redis, SQL, etc.)
  • Parallel / branching strategies and selection policies
  • Tool invocation nodes & function calling wrappers
  • Tracing / telemetry integration

License

MIT


Publishing to PyPI

  1. Build the package:
    uv pip install build twine
    python -m build

  2. Upload to PyPI:
    twine upload dist/*

For test uploads, use TestPyPI:

twine upload --repository testpypi dist/*
