Agent Protocol Conductor - A production-ready protocol for decentralized AI agent orchestration
Project description
APC: Agent Protocol Conductor
A protocol for decentralized, resilient, and auditable orchestration of heterogeneous AI agent ecosystems.
Key Features:
- Distributed, production-grade agent orchestration using gRPC or WebSocket
- Real Azure OpenAI agent integration (see examples/real_world)
- Persistent checkpointing for workflow recovery and auditability
- Structured, colorized logging for clear terminal output
- Example output and reports are saved to disk for review
Logging:
- INFO: Green
- WARNING: Bold Yellow
- ERROR: Bold Red
- DEBUG: Cyan (dim)
- CRITICAL: Magenta
- Workflow summary: Bold Yellow in terminal
Output:
- All workflow results and reports are logged and saved to files in the `./reports/` folder
- Checkpoints are saved in `./checkpoints/` for automatic recovery
See the example scripts in examples/real_world for real, end-to-end usage.
APC (Agent Protocol Conductor) is an open protocol and SDK designed to orchestrate distributed AI agents in a truly decentralized, resilient, and auditable way. With APC, you can build intelligent systems where multiple agents, each with their own roles and capabilities, work together to accomplish complex tasks, adapt to failures, and recover automatically, all without relying on a central controller.
🎯 The Problem APC Solves: Building multi-agent systems traditionally requires 200+ lines of custom orchestration code, manual dependency management, custom protocols, and complex error handling for every project.
⚡ The APC Solution: Just define workflow steps and dependencies - APC handles everything else automatically! Role-based routing, dependency management, error handling, service discovery, and communication protocols are all built-in.
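To make "dependency management handled automatically" concrete, here is a standalone sketch (not APC's internal code) of the kind of ordering a conductor must compute: given steps and their dependencies, produce a valid execution order via Kahn's topological sort.

```python
from collections import deque

def execution_order(steps: dict) -> list:
    """Return a valid execution order for workflow steps given their
    dependencies (Kahn's algorithm). Raises ValueError on a cycle."""
    indegree = {name: len(deps) for name, deps in steps.items()}
    dependents = {name: [] for name in steps}
    for name, deps in steps.items():
        for dep in deps:
            dependents[dep].append(name)

    ready = deque(name for name, d in indegree.items() if d == 0)
    order = []
    while ready:
        name = ready.popleft()
        order.append(name)
        for nxt in dependents[name]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)

    if len(order) != len(steps):
        raise ValueError("dependency cycle detected")
    return order

# The three-step research workflow described later in this README:
steps = {
    "conduct_research": [],
    "analyze_data": ["conduct_research"],
    "generate_report": ["analyze_data"],
}
print(execution_order(steps))
# ['conduct_research', 'analyze_data', 'generate_report']
```

With APC you never write this yourself; the conductor derives the order from the `dependencies=` arguments you pass when defining steps.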
Key features include:
- Dynamic Leadership: Any agent can become the conductor, coordinating workflows and handing off control as needed.
- Sequenced Task Execution: Define and manage multi-step processes, with each agent performing specialized subtasks.
- Checkpointing & Failover: Progress is saved at every step, so if an agent fails, another can seamlessly take over from the last checkpoint: no lost work, no manual intervention.
- Interoperability: Built on Protobuf schemas, APC supports cross-language agent ecosystems (Python, TypeScript, Java, and more).
- Extensibility & Security: Easily add new message types, enforce security with mTLS/JWT, and integrate custom business logic or LLMs.
APC is production-ready and ideal for both classic automation and advanced AI-powered workflows. Whether you're building ETL pipelines, LLM chatbots, or autonomous fleets, APC gives you the tools to create robust, scalable, and future-proof agent systems.
🚀 Quick Start
📥 Installation
```bash
# Install from PyPI
pip install apc-protocol

# Or from source
git clone https://github.com/deepfarkade/apc-protocol.git
cd apc-protocol
python setup.py
```
⭐ Try APC in 30 Seconds (No Setup Required!)
```bash
# Run the simple demo - shows APC benefits immediately
python examples/real_world/apc_simple_demo.py
```
🔥 Most Popular: Real AI Workflow
```bash
# 1. Add Azure OpenAI key to .env file
# 2. Run 3-agent research workflow
python examples/real_world/simple_azure_openai_demo.py
```
🧑‍💻 Basic Usage
```python
import asyncio

from apc import Worker
from apc.transport import GRPCTransport

# Create a worker with specific roles
worker = Worker("my-worker", roles=["data-processor"])

# Register task handlers
@worker.register_handler("process_data")
async def handle_data(batch_id: str, step_name: str, params: dict):
    # Your processing logic here
    return {"processed": params["data"], "status": "completed"}

async def main():
    # Set up transport and start serving
    transport = GRPCTransport(port=50051)
    worker.bind_transport(transport)
    await transport.start_server()

asyncio.run(main())
```
🛠️ Key Features
- Protobuf-based message schemas for cross-language interoperability
- Pluggable checkpoint manager (in-memory, Redis, S3)
- State machine engine for conductor and worker agents
- gRPC and WebSocket transport adapters
- Dynamic Leadership: Any agent can become the conductor
- Fault Tolerance: Automatic failover and recovery
- Cross-Language Support: Python, TypeScript, Java, and more
- Checkpointing: Save progress and resume from failures
- Security Ready: mTLS, JWT authentication support
🏗️ Architecture Overview
APC Protocol enables decentralized agent coordination with:
- Conductor Agent: The orchestrator that assigns tasks to Worker Agents based on a workflow plan. Maintains execution state and error recovery logic.
- Worker Agent: Domain-specific agents that perform specialized subtasks. They respond to commands from Conductors and return results.
- gRPC/WebSocket Layer: Communication backbone that enables bidirectional, low-latency messaging between agents.
- Checkpoint Store: Persistent storage layer used to save execution state. Enables seamless recovery without restarting entire workflows.
This modular setup enables dynamic, scalable, and fault-tolerant agent workflows where control is coordinated yet loosely coupled through standardized message passing.
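To make the Checkpoint Store's role concrete, here is a minimal standalone sketch of the save-and-resume pattern it enables. The real APC checkpoint schema and manager are internal to the SDK; the file naming and the `completed_steps` field below are illustrative assumptions only.

```python
import json
import tempfile
from pathlib import Path

def save_checkpoint(ckpt_dir: Path, batch_id: str, state: dict) -> Path:
    """Persist workflow state as JSON so another conductor can resume it.
    (Illustrative schema - the real APC checkpoint format is SDK-internal.)"""
    path = ckpt_dir / f"batch_{batch_id}_checkpoint.json"
    path.write_text(json.dumps(state))
    return path

def resume_from_checkpoint(ckpt_dir: Path, batch_id: str, all_steps: list) -> list:
    """Return the steps that still need to run, given the last checkpoint.
    With no checkpoint on disk, the whole workflow runs from the start."""
    path = ckpt_dir / f"batch_{batch_id}_checkpoint.json"
    if not path.exists():
        return all_steps
    state = json.loads(path.read_text())
    done = set(state.get("completed_steps", []))
    return [step for step in all_steps if step not in done]

# Simulate a conductor crashing after the first step, then a new one resuming:
ckpt_dir = Path(tempfile.mkdtemp())
save_checkpoint(ckpt_dir, "demo", {"completed_steps": ["conduct_research"]})
remaining = resume_from_checkpoint(
    ckpt_dir, "demo", ["conduct_research", "analyze_data", "generate_report"]
)
print(remaining)
# ['analyze_data', 'generate_report']
```

This is the pattern behind "no lost work": a replacement conductor reads the last checkpoint and schedules only the unfinished steps.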
📚 Examples & Tutorials
🎯 Value-Focused Demonstrations
Every example explicitly shows what problems APC solves and why it's essential:
| Demo | Description | Setup | Best For |
|---|---|---|---|
| 🎯 `apc_simple_demo.py` | Data processing pipeline | ✅ None needed! | ⭐ Start here - no setup required |
| 🔥 `simple_azure_openai_demo.py` | Research → Analysis → Report | Azure OpenAI key | Most popular - real AI workflow |
| ✈️ `anthropic_travel_planning_demo.py` | Travel planning workflow | Anthropic Claude key | Claude AI demonstration |
| 📈 `gemini_financial_analysis_demo.py` | Financial analysis pipeline | Google Gemini key | Gemini AI demonstration |
| 🏭 `azureopenai_supply_chain_demo.py` | Supply chain management | Azure OpenAI key | Business automation |
🎯 What These Demos Prove

❌ WITHOUT APC (Traditional Approach):
- 💻 ~200+ lines of custom orchestration code needed
- 🔧 Custom message passing between agents
- ⏰ Manual timeout and error handling
- 📊 Complex dependency tracking and execution order
- 🔍 Service discovery and agent registration
- 🛠️ Custom retry logic and failure recovery

✅ WITH APC (These Examples):
- ⚡ ~15 lines to define workflow steps and dependencies
- 🤖 Automatic role-based routing and execution
- 🛡️ Built-in timeout, error handling, and retries
- 📊 Dependency management handled automatically
- 🔍 Service discovery built into the protocol
- ✨ Just focus on your agent logic - APC handles the rest!
🚀 Quick Setup for API Demos

1. Copy the environment template:
```bash
cp .env.example .env
```

2. Add your API keys to `.env`:
```bash
# For Azure OpenAI demos (automatically detected by APC)
AZURE_OPENAI_API_KEY=your_key_here
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4
AZURE_OPENAI_API_VERSION=2024-02-15-preview  # Optional

# For Anthropic demos (coming soon)
ANTHROPIC_API_KEY=your_key_here

# For Gemini demos (coming soon)
GOOGLE_API_KEY=your_key_here
```

3. Run any demo - APC automatically detects and uses your `.env` settings:
```bash
python examples/real_world/simple_azure_openai_demo.py
```
🗃️ Checkpoints & Output

Checkpoint Management:
- All workflow state is automatically saved to the `./checkpoints/` directory
- Checkpoints enable automatic recovery if conductors or agents fail
- Default checkpoint interval: 30 seconds (configurable)
- Each checkpoint includes full workflow state, timing, and recovery metadata

Output Files:
- Reports: Generated research reports are saved as `reports/azure_research_report_<batch_id>.txt`
- Logs: Colored, structured logging shows workflow progress in the terminal
- Checkpoints: JSON files in `./checkpoints/` contain complete workflow state
Project Directory Structure:
```
./checkpoints/                               # Workflow checkpoints
├── azure_research_ws_1751380943.json        # WebSocket workflow checkpoint
├── azure_research_1751378779.json           # gRPC workflow checkpoint
└── batch_<id>_checkpoint.json               # Additional workflow states

./reports/                                   # Generated reports
├── azure_research_report_ws_1751381636.txt  # Latest research report
├── azure_research_report_1751378779.txt     # Previous reports
└── azure_research_report_<batch_id>.txt     # Additional reports
```
Log Colors (for easy visual tracking):
- 🟡 Yellow (WARNING): Key workflow events, progress, and results
- 🔴 Red (ERROR): Failures and critical issues
- 🔵 Cyan (DEBUG): Detailed technical information
- 🟣 Magenta (CRITICAL): System-level failures
- 🟣 Purple/Violet: LLM streaming responses and model calls

LLM Streaming Features:
- 🎨 Real-time streaming: See AI responses as they generate
- 🤖 Model identification: Clear display of which AI model is responding
- ⚡ Agent tracking: Know which agent is making each LLM call
- 📊 Performance stats: Response time and character count displayed
📚 Additional Resources
- Complete Documentation - Architecture, message schemas, state machines, checkpointing, transport adapters, security, registry, and advanced LLM integration
- Usage Guide - Comprehensive tutorials, production deployment, and advanced examples
- Basic Examples - Simple working code to get started
- Protocol Specification - Technical details and specifications
🧠 LLM Integration & Advanced Features
🎨 Streaming LLM Support
APC now includes production-ready streaming LLM clients with automatic environment configuration and colored terminal output:
```python
from apc.helpers.llms import AzureOpenAIStreamingClient

# Automatically loads from .env file - no manual configuration needed!
client = AzureOpenAIStreamingClient()

# Real-time streaming with purple/violet colored output
response = client.chat_completion_streaming(
    agent_name="Research Agent",
    messages=[{"role": "user", "content": "Analyze market trends"}],
    max_tokens=500
)
```
Key Features:
- 🎨 Real-time colored streaming: Purple/violet terminal output during LLM generation
- 🔧 Automatic `.env` detection: All configuration loaded from environment variables
- 📊 Performance tracking: Token count, timing, and model identification
- 🎯 Agent identification: Clear labeling of which agent is making LLM calls
- 🛡️ Error handling: Graceful fallbacks and clear error messages
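The purple streaming output can be approximated with plain ANSI escape codes. This standalone sketch simulates a token stream and is not the real client (the `AzureOpenAIStreamingClient` internals are part of the SDK); it only illustrates the terminal behavior described above.

```python
import sys

PURPLE = "\033[95m"  # ANSI bright magenta, rendered purple/violet in most terminals
RESET = "\033[0m"

def stream_tokens(tokens, out=sys.stdout) -> str:
    """Write each token in purple as it 'arrives' and return the full text,
    mimicking the look of a streaming LLM response in the terminal."""
    chunks = []
    for tok in tokens:
        out.write(f"{PURPLE}{tok}{RESET}")
        out.flush()  # flush per token so output appears incrementally
        chunks.append(tok)
    out.write("\n")
    return "".join(chunks)

full = stream_tokens(["Market ", "trends ", "look ", "mixed."])
print(f"({len(full)} characters streamed)")
```

The per-token `flush()` is what produces the real-time effect; without it, most terminals would buffer the whole line and print it at once.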
🔌 Modular LLM Architecture
All LLM providers are organized in a clean, extensible structure:
```
src/apc/helpers/llms/
├── __init__.py          # Unified exports
├── base.py              # BaseLLMClient (inherit from this)
├── azure_openai.py      # ✅ Full implementation
├── anthropic.py         # 🚧 Template ready
├── gemini.py            # 🚧 Template ready
├── openai.py            # 🚧 Template ready
└── custom_provider.py   # 🚧 Add your own here
```
🔐 Environment Configuration
All LLM settings are automatically loaded from your `.env` file:
```bash
# Azure OpenAI (Fully Supported)
AZURE_OPENAI_API_KEY=your_api_key_here
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4
AZURE_OPENAI_API_VERSION=2024-02-15-preview

# Anthropic (Template Ready)
ANTHROPIC_API_KEY=your_anthropic_api_key_here
ANTHROPIC_MODEL=claude-3-sonnet-20240229

# Google Gemini (Template Ready)
GOOGLE_API_KEY=your_google_api_key_here
GEMINI_MODEL=gemini-pro
```
🚀 Example: Multi-Agent Research Workflow with APC
Here's how APC transforms complex multi-agent coordination:
🎯 Scenario: Research → Analysis → Report generation using 3 specialized AI agents
❌ Traditional Approach (200+ lines):
```python
# Complex custom orchestration code needed:
# - Agent discovery and registration
# - Custom message passing protocols
# - Manual dependency tracking
# - Error handling and retries
# - Timeout management
# - Data serialization/deserialization
# - Resource coordination
# ... 200+ lines of boilerplate code
```
✅ With APC (15 lines):
```python
# Just define the workflow - APC handles everything!
workflow = conductor.create_workflow("research_workflow")

# Step 1: Research (no dependencies)
workflow.add_step("conduct_research", required_role="researcher")

# Step 2: Analysis (waits for research)
workflow.add_step("analyze_data", required_role="analyzer",
                  dependencies=["conduct_research"])

# Step 3: Report (waits for analysis)
workflow.add_step("generate_report", required_role="reporter",
                  dependencies=["analyze_data"])

# Execute - APC orchestrates everything automatically!
result = await conductor.execute_workflow(workflow)
```
🎯 Result: APC automatically handles role-based routing, dependency management, error recovery, timeouts, and data flow between agents. No custom orchestration code needed!
🤝 Contributing
We welcome contributions! Here's how to get started:
Development Setup
```bash
git clone https://github.com/deepfarkade/apc-protocol.git
cd apc-protocol
python setup.py
python scripts/test_package.py
```
Key Files
- `proto/apc.proto` - Protocol definitions
- `src/apc/` - Core Python SDK
- `examples/` - Usage examples
- `docs/` - Documentation
Testing
```bash
# Run basic tests
python scripts/test_package.py

# Run protocol demo
python scripts/demo.py

# Test example workflows
python examples/real_world/apc_simple_demo.py
python examples/basic/simple_grpc.py
```
📦 Release Information
- Current Release: v0.1.x (Alpha)
- See Releases for changelogs and version history.
- This is the first public alpha release of the APC protocol and SDK.
🛡️ License
MIT
📚 Advanced Topics & Detailed Comparisons
For comprehensive technical documentation including:
- Framework Comparisons: Detailed comparison with AutoGen and other multi-agent frameworks
- Protocol Evolution: Understanding the MCP → A2A → ACP → APC evolution
- Architecture Deep-Dive: Message schemas, state machines, transport adapters
- Real-World Scenarios: Complex deployment patterns and use cases
- Security & Production: mTLS, JWT, policy engines, enterprise deployment
See our complete documentation:
- Technical Documentation - Complete architecture and advanced features
- Usage Guide - Comprehensive tutorials and production patterns
File details
Details for the file apc_protocol-0.1.17.tar.gz.
File metadata
- Download URL: apc_protocol-0.1.17.tar.gz
- Upload date:
- Size: 923.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `1863bbb20b02f9c0df5cfce68bc5e7f931b4144b739686c7ed7494a4f7c9641a` |
| MD5 | `4c186e59bcfbb5aeb6a37894b9287e36` |
| BLAKE2b-256 | `e2db7f9def97401d730fdd2748b907653354923695450c38466a8b02d1195cc9` |
File details
Details for the file apc_protocol-0.1.17-py3-none-any.whl.
File metadata
- Download URL: apc_protocol-0.1.17-py3-none-any.whl
- Upload date:
- Size: 43.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `0ee9f97129e27234935f9ac7d77a6d8ad936f8bd04aff40267a62322bb18a6db` |
| MD5 | `92a021f8b84cc311e3f7ee1e6d7e6adf` |
| BLAKE2b-256 | `5f411789b93feab55cc996dcdeda700118beef6d7eeb53b87ab20fda57ff921f` |