

Python A2A


The Definitive Python Implementation of Google's Agent-to-Agent (A2A) Protocol

🌟 Overview

Python A2A is a comprehensive, production-ready library for implementing Google's Agent-to-Agent (A2A) protocol. It provides everything you need to build interoperable AI agent ecosystems that can collaborate seamlessly to solve complex problems.

The A2A protocol establishes a standard communication format that enables AI agents to interact regardless of their underlying implementation. Python A2A makes this protocol accessible with an intuitive API that developers of all skill levels can use to build sophisticated multi-agent systems.

✨ Why Choose Python A2A?

  • Complete Implementation: Fully implements the official A2A specification with zero compromises
  • Enterprise Ready: Built for production environments with robust error handling and validation
  • Framework Agnostic: Works with any Python framework (Flask, FastAPI, Django, etc.)
  • LLM Provider Flexibility: Native integrations with OpenAI, Anthropic, and HuggingFace
  • Minimal Dependencies: Core functionality requires only the requests library
  • Excellent Developer Experience: Comprehensive documentation, type hints, and examples

📦 Installation

Install the base package with minimal dependencies:

pip install python-a2a

Or install with optional components based on your needs:

# For Flask-based server support
pip install "python-a2a[server]"

# For OpenAI integration
pip install "python-a2a[openai]"

# For Anthropic Claude integration
pip install "python-a2a[anthropic]"

# For all optional dependencies
pip install "python-a2a[all]"

🚀 Quick Start Examples

1. Create a Simple A2A Agent Server

from python_a2a import A2AServer, Message, TextContent, MessageRole, run_server

class EchoAgent(A2AServer):
    """A simple agent that echoes back messages with a prefix."""
    
    def handle_message(self, message):
        if message.content.type == "text":
            return Message(
                content=TextContent(text=f"Echo: {message.content.text}"),
                role=MessageRole.AGENT,
                parent_message_id=message.message_id,
                conversation_id=message.conversation_id
            )

# Run the server
if __name__ == "__main__":
    agent = EchoAgent()
    run_server(agent, host="0.0.0.0", port=5000)

2. Send Messages to an A2A Agent

from python_a2a import A2AClient, Message, TextContent, MessageRole
from python_a2a.utils import pretty_print_message

# Create a client connected to an A2A-compatible agent
client = A2AClient("http://localhost:5000/a2a")

# Create a simple message
message = Message(
    content=TextContent(text="Hello, A2A!"),
    role=MessageRole.USER
)

# Send the message and get a response
response = client.send_message(message)

# Display the response
pretty_print_message(response)

3. Create an LLM-Powered Agent

import os
from python_a2a import OpenAIA2AServer, run_server

# Create an agent powered by OpenAI
agent = OpenAIA2AServer(
    api_key=os.environ["OPENAI_API_KEY"],
    model="gpt-4",
    system_prompt="You are a helpful AI assistant specialized in explaining complex topics simply."
)

# Run the server
if __name__ == "__main__":
    run_server(agent, host="0.0.0.0", port=5000)

4. Build an Agent Chain for Complex Tasks

from python_a2a import A2AClient, Message, TextContent, MessageRole

# Connect to specialized agents
weather_agent = A2AClient("http://localhost:5001/a2a")
planning_agent = A2AClient("http://localhost:5002/a2a")

def plan_trip(location):
    """Chain multiple agents to plan a trip."""
    # Step 1: Get weather information
    weather_message = Message(
        content=TextContent(text=f"What's the weather forecast for {location}?"),
        role=MessageRole.USER
    )
    weather_response = weather_agent.send_message(weather_message)
    
    # Step 2: Use weather data to create a trip plan
    planning_message = Message(
        content=TextContent(
            text=f"I'm planning a trip to {location}. Weather forecast: {weather_response.content.text}"
                 f"Please suggest activities and packing recommendations."
        ),
        role=MessageRole.USER
    )
    planning_response = planning_agent.send_message(planning_message)
    
    return planning_response.content.text

# Use the chained agents
trip_plan = plan_trip("Tokyo")
print(trip_plan)

🧩 Core Features

Messages and Conversations

Python A2A provides a rich set of models for A2A messages and conversations:

from python_a2a import (
    Message, TextContent, FunctionCallContent, FunctionResponseContent, 
    MessageRole, Conversation
)

# Create a conversation
conversation = Conversation()

# Add messages to the conversation
conversation.create_text_message(
    text="What's the weather like in New York?", 
    role=MessageRole.USER
)

# Add a function call message
conversation.create_function_call(
    name="get_weather",
    parameters=[
        {"name": "location", "value": "New York"},
        {"name": "unit", "value": "celsius"}
    ],
    role=MessageRole.AGENT
)

# Add a function response
conversation.create_function_response(
    name="get_weather",
    response={"temperature": 22, "conditions": "Partly Cloudy"},
    role=MessageRole.AGENT
)

Function Calling

The A2A protocol supports function calling between agents, making it easy to expose capabilities:

from python_a2a import (
    Message, FunctionCallContent, FunctionParameter, FunctionResponseContent,
    MessageRole
)

# Create a function call message
function_call = Message(
    content=FunctionCallContent(
        name="calculate",
        parameters=[
            FunctionParameter(name="operation", value="add"),
            FunctionParameter(name="a", value=5),
            FunctionParameter(name="b", value=3)
        ]
    ),
    role=MessageRole.USER
)

# Create a function response message
function_response = Message(
    content=FunctionResponseContent(
        name="calculate",
        response={"result": 8}
    ),
    role=MessageRole.AGENT,
    parent_message_id=function_call.message_id
)
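
The server side can answer such a call from handle_message, following the same pattern as the EchoAgent shown earlier. The dispatch below is an illustrative sketch rather than the library's prescribed pattern: it assumes the function-call content reports its type as "function_call" and that its parameters expose .name and .value, mirroring the FunctionParameter objects above.

from python_a2a import A2AServer, Message, FunctionResponseContent, TextContent, MessageRole

class CalculatorAgent(A2AServer):
    """Illustrative agent that answers the 'calculate' function call above."""

    def handle_message(self, message):
        # Assumption: function-call content identifies itself as "function_call"
        if message.content.type == "function_call" and message.content.name == "calculate":
            # Collect parameters by name (mirrors FunctionParameter(name=..., value=...))
            params = {p.name: p.value for p in message.content.parameters}
            result = params["a"] + params["b"] if params["operation"] == "add" else None
            return Message(
                content=FunctionResponseContent(name="calculate", response={"result": result}),
                role=MessageRole.AGENT,
                parent_message_id=message.message_id,
                conversation_id=message.conversation_id
            )
        # Anything else gets a plain text reply
        return Message(
            content=TextContent(text="Send a 'calculate' function call with operation, a, and b."),
            role=MessageRole.AGENT,
            parent_message_id=message.message_id,
            conversation_id=message.conversation_id
        )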

Command-Line Interface

Python A2A includes a CLI for interacting with A2A agents:

# Send a message to an agent
a2a send http://localhost:5000/a2a "What's the weather like in Tokyo?"

# Start a simple A2A server
a2a serve --host 0.0.0.0 --port 5000

# Start an OpenAI-powered agent
a2a openai --api-key YOUR_API_KEY --model gpt-4

# Start an Anthropic-powered agent
a2a anthropic --api-key YOUR_API_KEY --model claude-3-opus-20240229

📖 Architecture & Design Principles

Python A2A is built on three core design principles:

  1. Protocol First: Adheres strictly to the A2A protocol specification for maximum interoperability

  2. Modularity: All components are designed to be composable and replaceable

  3. Progressive Enhancement: Start simple and add complexity only as needed

The architecture consists of four main components:

  • Models: Data structures representing A2A messages and conversations
  • Client: Components for sending messages to A2A agents
  • Server: Components for building A2A-compatible agents
  • Utils: Helper functions for common tasks
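
To make the layering concrete, here is a rough sketch that reuses only the classes from the quick start: the Models layer builds the message, a Server subclass handles it, the Client delivers it, and a Utils helper prints the reply (the port number is arbitrary).

from python_a2a import A2AClient, A2AServer, Message, TextContent, MessageRole, run_server
from python_a2a.utils import pretty_print_message

# Server + Models: a minimal agent that shouts back whatever it receives
class UppercaseAgent(A2AServer):
    def handle_message(self, message):
        return Message(
            content=TextContent(text=message.content.text.upper()),
            role=MessageRole.AGENT,
            parent_message_id=message.message_id,
            conversation_id=message.conversation_id
        )

# In one process: run_server(UppercaseAgent(), host="0.0.0.0", port=5003)

# Client + Utils: from another process, send a message and print the reply
client = A2AClient("http://localhost:5003/a2a")
reply = client.send_message(Message(content=TextContent(text="hello"), role=MessageRole.USER))
pretty_print_message(reply)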

🗺️ Use Cases

Python A2A can be used to build a wide range of AI systems:

Research & Development

  • Experimentation Framework: Easily swap out different LLM backends while keeping the same agent interface
  • Benchmark Suite: Compare performance of different agent implementations on standardized tasks

Enterprise Systems

  • AI Orchestration: Coordinate multiple AI agents across different departments
  • Legacy System Integration: Wrap legacy systems with A2A interfaces for AI accessibility (see the sketch at the end of this section)

Customer-Facing Applications

  • Multi-Stage Assistants: Break complex user queries into subtasks handled by specialized agents
  • Tool-Using Agents: Connect LLMs to database agents, calculation agents, and more

Education & Training

  • AI Education: Create educational systems that demonstrate agent collaboration
  • Simulation Environments: Build simulated environments where multiple agents interact
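
As a concrete illustration of the Legacy System Integration pattern above, the sketch below puts an A2A interface in front of a hypothetical in-house lookup. lookup_inventory is a stand-in for whatever call your existing system actually exposes; everything else uses only classes from the quick start.

from python_a2a import A2AServer, Message, TextContent, MessageRole, run_server

def lookup_inventory(item_name):
    """Hypothetical stand-in for a call into an existing legacy system."""
    return {"item": item_name, "in_stock": 42}

class InventoryAgent(A2AServer):
    """Exposes the legacy lookup to any A2A-compatible agent."""

    def handle_message(self, message):
        if message.content.type == "text":
            record = lookup_inventory(message.content.text.strip())
            reply = f"{record['item']}: {record['in_stock']} units in stock"
        else:
            reply = "Please send the item name as plain text."
        return Message(
            content=TextContent(text=reply),
            role=MessageRole.AGENT,
            parent_message_id=message.message_id,
            conversation_id=message.conversation_id
        )

if __name__ == "__main__":
    run_server(InventoryAgent(), host="0.0.0.0", port=5004)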

🔍 Detailed Documentation

For comprehensive documentation, tutorials, and API reference, visit:

🤝 Community & Support

⭐ Star This Repository

If you find this library useful, please consider giving it a star on GitHub! It helps others discover the project and motivates further development.


🛣️ Roadmap

  • Streaming Support: Streaming responses for real-time communication
  • WebSocket Support: WebSocket transport for persistent connections
  • More LLM Integrations: Support for additional LLM providers
  • Agent Registry: A registry for discovering and registering agents
  • Agent Composition Tools: Higher-level tools for composing agents

🙏 Acknowledgements

👨‍💻 Author

Manoj Desai

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


Made with ❤️ by Manoj Desai



Download files

Download the file for your platform.

Source Distribution

python_a2a-0.1.2.tar.gz (33.6 kB)

Uploaded Source

Built Distribution


python_a2a-0.1.2-py3-none-any.whl (43.2 kB)

Uploaded Python 3

File details

Details for the file python_a2a-0.1.2.tar.gz.

File metadata

  • Download URL: python_a2a-0.1.2.tar.gz
  • Upload date:
  • Size: 33.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.6

File hashes

Hashes for python_a2a-0.1.2.tar.gz:

  • SHA256: 3d4645265330faeeb8ec36f0c939b763dc49f8abd5018c953b379594bcc170d1
  • MD5: 5f7e2ea6d0f9895fb275c2d25707ef67
  • BLAKE2b-256: 0ea7c0275b38a0981267478c9a3bee20da2c44ff256770b05ac01455894aa687


File details

Details for the file python_a2a-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: python_a2a-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 43.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.6

File hashes

Hashes for python_a2a-0.1.2-py3-none-any.whl:

  • SHA256: 3731d7c5786846c2a9bec964eec38551d3d82c009ccd81ad86659005a732db28
  • MD5: cdc0884780af398d2375b868abba043a
  • BLAKE2b-256: 84eac4481c6430278e95988b8b5f788d1bfacdd3412a11908175d7a35dc38865

