# Autobyteus

A multi-agent framework for Python.
Autobyteus is an open-source, application-first agentic framework for Python. It is designed to help developers build, test, and deploy complex, stateful, and extensible AI agents by providing a robust architecture and a powerful set of tools.
## Architecture
Autobyteus is built with a modular, event-driven architecture designed for extensibility and clear separation of concerns. The key components are:
- Agent Core: The heart of the system. Each agent is a stateful, autonomous entity that runs as a background process in its own thread, managed by a dedicated `AgentWorker`. This design makes every agent a truly independent entity capable of handling long-running tasks.
- Agent Teams: (Design Doc) The framework provides powerful constructs for building hierarchical multi-agent systems. The `AgentTeam` module allows you to compose teams of individual agents and even nest teams within other teams, enabling sophisticated, real-world organizational structures and delegation patterns.
- Context & Configuration: Agent behavior is defined through a static configuration (`AgentConfig`), and its dynamic state is managed in `AgentRuntimeState`. These are bundled into a comprehensive `AgentContext` that is passed to all components, providing a single source of truth.
- Event-Driven System: (Design Doc) Agents operate on an internal `asyncio` event loop. User messages, tool results, and internal signals are handled as events, which are processed by dedicated `EventHandler`s. This decouples logic and makes the system highly extensible.
- Pluggable Processors & Hooks: The framework provides a chain of extension points to inject custom logic at every major step of an agent's reasoning loop. This architecture powers features like flexible tool format parsing. You can customize behavior by implementing:
  - `InputProcessors`: Modify or enrich user messages before they are sent to the LLM.
  - `LLMResponseProcessors`: Parse the LLM's raw output and extract structured actions, such as tool calls.
  - `ToolExecutionResultProcessors`: Modify the result from a tool before it is sent back to the LLM for the next step of reasoning (e.g., formatting, summarization, artifact extraction).
  - Lifecycle event processors: Run custom code on specific lifecycle events (e.g., `BEFORE_LLM_CALL`, `AFTER_TOOL_EXECUTE`).
- Context-Aware Tooling: Tools are first-class citizens that receive the agent's full `AgentContext` during execution. This allows tools to be deeply integrated with the agent's state, configuration, and workspace, enabling more intelligent and powerful actions.
- Tool Approval Flow: The framework has native support for human-in-the-loop workflows. With `auto_execute_tools=False` in the agent's configuration, the agent pauses before executing a tool, emits an event requesting permission, and waits for external approval before proceeding.
- MCP Integration: The framework has native support for the Model Context Protocol (MCP). This allows agents to discover and use tools from external, language-agnostic tool servers, making the ecosystem extremely flexible and ready for enterprise integration.
- Agent Skills: (Design Doc) A powerful mechanism for extending agent capabilities using modular, file-based skills. Each skill is a directory containing a map (`SKILL.md`) and arbitrary assets (code, docs, templates). Skills can be preloaded or dynamically fetched via the `load_skill` tool, enabling human-like, just-in-time retrieval without bloating the context window.
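At its core, a processor chain is a simple pipeline: each stage receives a value and returns a possibly modified value. The sketch below illustrates the concept generically; it does not use the Autobyteus API, and `apply_chain`, `redact`, and `enrich` are invented names for illustration only.

```python
# Generic illustration of a processor chain; NOT the Autobyteus API.
# Each processor takes a value and returns a (possibly modified) value.
def apply_chain(processors, value):
    for process in processors:
        value = process(value)
    return value

# Two toy "input processors": one redacts, one enriches the user message.
def redact(msg: str) -> str:
    return msg.replace("secret", "[redacted]")

def enrich(msg: str) -> str:
    return msg + " [user context attached]"

print(apply_chain([redact, enrich], "my secret plan"))
# -> my [redacted] plan [user context attached]
```

The same shape applies to response and tool-result processing: order matters, and each stage stays small and testable in isolation.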
## Key Features

### Interactive TUI Dashboard

Launch and monitor your agent teams with the built-in Textual-based TUI.
- Hierarchical View: See the structure of your team, including sub-teams and their agents.
- Real-Time Status: Agent and team statuses are updated live, showing you who is idle, thinking, or executing a tool.
- Detailed Logs: Select any agent to view a detailed, streaming log of their thoughts, actions, and tool interactions.
- Live Task Plan: Watch your team's `TaskPlan` update in real time as the coordinator publishes a plan and agents complete their tasks.
*(Screenshots: TUI detailed agent log; TUI task plan with a completed task.)*
### Fluent Team Building

Define complex agent and team structures with an intuitive, fluent API. The `AgentTeamBuilder` makes composing your team simple and readable.
```python
# --- From the Multi-Researcher Team Example ---
research_team = (
    AgentTeamBuilder(
        name="MultiSpecialistResearchTeam",
        description="A team for delegating to multiple specialists.",
    )
    .set_coordinator(coordinator_config)
    .add_agent_node(researcher_web_config)
    .add_agent_node(researcher_db_config)
    .build()
)
```
### Flexible Tool Formatting (JSON & XML)
Autobyteus intelligently handles tool communication with LLMs while giving you full control.
- Provider-Aware by Default: The framework automatically generates tool manifests in the optimal format for the selected LLM provider (e.g., JSON for OpenAI/Gemini, XML for Anthropic).
- Format Override via Env: Set `AUTOBYTEUS_STREAM_PARSER=xml` (or `json`) to force tool-call formatting to that format regardless of provider. This can be useful for consistency or for large, complex schemas.
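For example, to pin XML tool-call formatting for a single run, you might export the variable before launching an agent script (a sketch; the example script path is one of the bundled examples):

```shell
# Force XML tool-call formatting regardless of the LLM provider
export AUTOBYTEUS_STREAM_PARSER=xml
python autobyteus/examples/agent_team/event_driven/run_software_engineering_team.py --llm-model gpt-4o
```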
### Flexible Communication Protocols

Choose the collaboration pattern that best fits your use case with configurable `TaskNotificationMode`s.
- `AGENT_MANUAL_NOTIFICATION` (default): A traditional approach in which a coordinator agent is responsible for creating a plan and then explicitly notifying other agents to begin their work via messages.
- `SYSTEM_EVENT_DRIVEN`: A more automated approach in which the coordinator's only job is to publish a plan to the `TaskPlan`. The framework then monitors the board and automatically notifies agents when their tasks become unblocked, enabling parallel execution and reducing coordinator overhead.
- Env Override: Set `AUTOBYTEUS_TASK_NOTIFICATION_MODE=system_event_driven` (or `agent_manual_notification`) to pick the default for all teams.
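The difference between the modes comes down to who notices that a task has become unblocked. The toy sketch below (plain Python, not the Autobyteus API; the task names are invented) shows the kind of dependency tracking that the system-event-driven mode automates: as soon as all of a task's dependencies are complete, it is eligible for dispatch, and independent tasks surface together for parallel execution.

```python
# Toy sketch of event-driven task unblocking; NOT the Autobyteus API.
# Each task maps to the list of tasks it depends on.
tasks = {
    "design": [],
    "implement": ["design"],
    "test": ["implement"],
    "docs": ["design"],  # independent of "implement" -> can run in parallel
}

done, order = set(), []
while len(done) < len(tasks):
    # A task is unblocked once every dependency has completed.
    unblocked = [t for t, deps in tasks.items()
                 if t not in done and all(d in done for d in deps)]
    order.append(sorted(unblocked))  # each wave could be dispatched concurrently
    done.update(unblocked)

print(order)
# -> [['design'], ['docs', 'implement'], ['test']]
```

In the manual mode, the coordinator performs this bookkeeping itself by messaging each agent; in the event-driven mode, the framework does it from the published plan.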
## Requirements

- Python Version: Python 3.11.x is the supported version for this project (`>=3.11,<3.12`). Using other versions may cause dependency conflicts.
- Platform Support:
  - Linux/macOS: Full support for all tools.
  - Windows: Supported via WSL (Windows Subsystem for Linux).
    - WSL Required: Terminal tools (`run_bash`, etc.) require WSL to be installed (`wsl --install`) and an active Linux distribution.
    - Default Distro: If you have multiple WSL distros, set Ubuntu as the default to avoid Docker's minimal distro (check with `wsl -l -v`, then run `wsl --set-default Ubuntu`).
    - Dependency: `tmux` is required inside WSL for terminal integration on Windows.
    - For detailed Windows setup, see the Terminal Tools Documentation.
## Getting Started

### Installation

1. Clone the repository:

   ```shell
   git clone https://github.com/your-username/autobyteus.git
   cd autobyteus
   ```

2. Create a local `uv` environment (recommended):

   ```shell
   uv venv .venv --python 3.11
   ```

3. Install dependencies:

   - For users: `uv sync`
   - For developers: `uv sync --extra dev`

4. Set up environment variables: Create a `.env` file in the root of the project and add your LLM provider API keys:

   ```shell
   # .env
   OPENAI_API_KEY="sk-..."
   KIMI_API_KEY="your-kimi-api-key"
   # etc.
   ```
### Running the Examples

The best way to experience Autobyteus is to run one of the included examples. The event-driven software engineering team is a great showcase of the framework's capabilities.

```shell
# Run the event-driven software engineering team example
python autobyteus/examples/agent_team/event_driven/run_software_engineering_team.py --llm-model gpt-4o

# Run the hierarchical debate team example
python autobyteus/examples/agent_team/manual_notification/run_debate_team.py --llm-model gpt-4-turbo

# Run the hierarchical skills example (modular, file-based capabilities)
python examples/run_agent_with_skill.py --llm-model gpt-4o
```

You can see all available models and their identifiers by running an example with the `--help-models` flag.
## Testing

### Streamable HTTP MCP integration

Some integration tests rely on the toy streamable MCP server that lives in `autobyteus_mcps/streamable_http_mcp_toy`. Start it in a separate terminal before running the test, for example:

```shell
cd autobyteus_mcps/streamable_http_mcp_toy
python src/streamable_http_mcp_toy/server.py --host 127.0.0.1 --port 8764
```

With the server running, execute the HTTP transport test:

```shell
uv run python -m pytest tests/integration_tests/tools/mcp/test_http_managed_server_integration.py
```

If you bind the server elsewhere, set `STREAMABLE_HTTP_MCP_URL` to the full `http://` or `https://` endpoint before running pytest so the test can find it.
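For instance, if you bound the server to a different port, the override might look like this (the port and path below are illustrative; substitute your actual endpoint):

```shell
# Illustrative non-default binding; use the endpoint your server actually serves
export STREAMABLE_HTTP_MCP_URL="http://127.0.0.1:9000/mcp"
uv run python -m pytest tests/integration_tests/tools/mcp/test_http_managed_server_integration.py
```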
### Secure WebSocket (WSS) MCP integration

The toy WebSocket MCP server lives in `autobyteus_mcps/wss_mcp_toy`. It exposes the same diagnostic tools as the HTTP toy server but requires TLS and an Origin header. To exercise the WebSocket transport:

1. In a separate terminal, start the toy server:

   ```shell
   cd autobyteus_mcps/wss_mcp_toy
   python3 -m venv .venv
   source .venv/bin/activate
   pip install -e .
   ./scripts/generate-dev-cert.sh  # creates certs/dev-cert.pem + certs/dev-key.pem
   wss-mcp-toy --cert certs/dev-cert.pem --key certs/dev-key.pem --host 127.0.0.1 --port 8765 --allowed-origin https://localhost
   ```

2. Run the WebSocket transport test (defaults assume the process above is listening on `wss://127.0.0.1:8765/mcp`):

   ```shell
   uv run python -m pytest tests/integration_tests/tools/mcp/test_websocket_managed_server_integration.py
   ```

Customize the target URL or TLS behavior via environment variables when running pytest:

- `WSS_MCP_URL`: Full `ws://` or `wss://` endpoint (default `wss://127.0.0.1:8765/mcp`).
- `WSS_MCP_ORIGIN`: Origin header value (default `https://localhost`).
- `WSS_MCP_VERIFY_TLS`: Set to `true`/`1` to enforce TLS verification (default `false` for the self-signed dev cert).
- `WSS_MCP_CA_FILE`, `WSS_MCP_CLIENT_CERT`, `WSS_MCP_CLIENT_KEY`: Optional paths if you want to trust a custom CA or present a client certificate.
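As a concrete sketch, pointing the test at a non-default endpoint while trusting the dev certificate might look like this (the URL and file paths are illustrative; only the variable names are documented above):

```shell
# Illustrative values; substitute your real endpoint and certificate paths
export WSS_MCP_URL="wss://127.0.0.1:9443/mcp"
export WSS_MCP_ORIGIN="https://localhost"
export WSS_MCP_VERIFY_TLS=true
export WSS_MCP_CA_FILE="certs/dev-cert.pem"
uv run python -m pytest tests/integration_tests/tools/mcp/test_websocket_managed_server_integration.py
```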
## Building the Library

To build Autobyteus as a distributable package, follow these steps:

1. Ensure dev dependencies are installed:

   ```shell
   uv sync --extra dev
   ```

2. Build the distribution packages defined in `pyproject.toml`:

   ```shell
   uv run python -m build
   ```

This creates a `dist` directory containing the sdist and wheel artifacts.
## Contributing

(Add guidelines for contributing to the project)

## License

This project is licensed under the MIT License.