Python A2A

A comprehensive Python library for Google's Agent-to-Agent (A2A) protocol.
🌟 Overview
Python A2A is a powerful, easy-to-use library for implementing Google's Agent-to-Agent (A2A) protocol. It enables seamless communication between AI agents, creating interoperable agent ecosystems that can collaborate to solve complex problems.
Whether you're building specialized agents with distinct capabilities, orchestrating complex workflows, or creating modular AI systems, Python A2A makes it simple to implement the A2A protocol in your applications.
🚀 Key Features
- Complete Protocol Implementation: Full implementation of Google's A2A protocol specification
- Message & Conversation Models: Robust data models for A2A messages and conversations
- HTTP Client & Server: Easy-to-use HTTP client and server components
- LLM Integration: Built-in support for OpenAI, Anthropic (Claude), and HuggingFace models
- Function Calling: First-class support for function calling between agents (see the sketch after this list)
- CLI Tools: Command-line interface for interacting with A2A agents
- Comprehensive Validation: Robust validation and error handling
- Type Hints: Complete type annotations for better IDE support
- Thorough Documentation: Detailed documentation and examples
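To give a flavor of the function-calling support, here is a minimal sketch of an agent answering a structured call. The content types `FunctionCallContent` and `FunctionResponseContent` are assumed names used purely for illustration and may not match the library's actual API; see the documentation for the exact types.

```python
# Hypothetical sketch of function calling between agents. FunctionCallContent
# and FunctionResponseContent are assumed names for illustration and may
# differ from the real python_a2a API.
from python_a2a import A2AServer, Message, MessageRole, run_server
from python_a2a import FunctionCallContent, FunctionResponseContent  # assumed imports

class CalculatorAgent(A2AServer):
    def handle_message(self, message):
        # Answer a structured call such as add(a=2, b=3)
        if message.content.type == "function_call" and message.content.name == "add":
            params = {p.name: p.value for p in message.content.parameters}
            return Message(
                content=FunctionResponseContent(
                    name="add",
                    response={"sum": params["a"] + params["b"]}
                ),
                role=MessageRole.AGENT,
                parent_message_id=message.message_id,
                conversation_id=message.conversation_id
            )

run_server(CalculatorAgent(), host="0.0.0.0", port=5003)
```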
📦 Installation
```bash
pip install python-a2a
```
🔍 Quick Start
Creating a Simple A2A Agent
```python
from python_a2a import A2AServer, Message, TextContent, MessageRole, run_server

class EchoAgent(A2AServer):
    def handle_message(self, message):
        if message.content.type == "text":
            return Message(
                content=TextContent(text=f"Echo: {message.content.text}"),
                role=MessageRole.AGENT,
                parent_message_id=message.message_id,
                conversation_id=message.conversation_id
            )

# Run the server
agent = EchoAgent()
run_server(agent, host="0.0.0.0", port=5000)
```
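Run this script and the echo agent listens on port 5000; the client in the next example reaches it at `http://localhost:5000/a2a`.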
Sending Messages to an Agent
```python
from python_a2a import A2AClient, Message, TextContent, MessageRole

# Create a client
client = A2AClient("http://localhost:5000/a2a")

# Create a message
message = Message(
    content=TextContent(text="Hello, agent!"),
    role=MessageRole.USER
)

# Send the message and get a response
response = client.send_message(message)
print(f"Agent response: {response.content.text}")
```
Creating an LLM-Powered Agent
```python
from python_a2a import OpenAIA2AServer, run_server
import os

# Create an OpenAI-powered agent
agent = OpenAIA2AServer(
    api_key=os.environ.get("OPENAI_API_KEY"),
    model="gpt-4",
    system_prompt="You are a helpful assistant."
)

# Run the server
run_server(agent, host="0.0.0.0", port=5000)
```
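Because the OpenAI-backed agent is served with the same `run_server` call, it speaks the same A2A protocol as any other agent and can be queried with the `A2AClient` shown above.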
Chaining Multiple Agents
```python
from python_a2a import A2AClient, Message, TextContent, MessageRole

# Create clients for different specialized agents
weather_client = A2AClient("http://localhost:5001/a2a")
planning_client = A2AClient("http://localhost:5002/a2a")

# Ask the weather agent about the forecast
weather_message = Message(
    content=TextContent(text="What's the weather like in Tokyo?"),
    role=MessageRole.USER
)
weather_response = weather_client.send_message(weather_message)

# Use the weather information to ask the planning agent for recommendations
planning_message = Message(
    content=TextContent(
        text=f"I'm planning a trip to Tokyo. Here's the weather forecast: {weather_response.content.text}"
    ),
    role=MessageRole.USER
)
planning_response = planning_client.send_message(planning_message)
print(planning_response.content.text)
```
📚 Detailed Documentation
For more detailed documentation and examples, please check out our Documentation Site.
💡 Why A2A Matters
The Agent-to-Agent (A2A) protocol enables a new paradigm of interoperable AI systems with several key benefits:
- Specialization: Agents can excel at specific tasks rather than trying to do everything
- Modularity: Components can be improved or replaced independently
- Composability: Agents can be combined in different ways to solve new problems
- Robustness: If one agent fails, others can continue to operate
- Scalability: Complex workflows can be broken down into manageable pieces
A2A allows developers to create ecosystems of agents that can collaborate to solve complex problems that would be difficult for a single agent to handle alone.
📋 Example Use Cases
- Multi-step reasoning: Break down complex reasoning into specialized steps
- Tool use: Connect LLMs to specialized agents that access tools and APIs
- Customer service: Route customer queries to specialized agent bots
- Research assistants: Combine agents for literature search, data analysis, and summary generation
- Collaborative writing: Connect agents for ideation, drafting, editing, and fact-checking
- Enterprise systems: Integrate agents that interface with different internal tools and databases
🔗 Resources
- Google A2A Protocol Documentation
- Google A2A GitHub Repository
- Google Developers Blog: A2A - A New Era of Agent Interoperability
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
👨‍💻 Author
Manoj Desai
- GitHub: themanojdesai
- LinkedIn: themanojdesai
- Medium: @the_manoj_desai
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file python_a2a-0.1.0.tar.gz.
File metadata
- Download URL: python_a2a-0.1.0.tar.gz
- Upload date:
- Size: 29.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2cb0edeb61d88881e07d903e23aeba8770a15cac6a8dee30c4c0b789dec395cb |
| MD5 | 29bc717744adb60c92133aaa315f6a79 |
| BLAKE2b-256 | 8901e15724c8a66dad61e0438fc699c8a818aeea477f88d62a267b1e6b439ccc |
File details
Details for the file python_a2a-0.1.0-py3-none-any.whl.
File metadata
- Download URL: python_a2a-0.1.0-py3-none-any.whl
- Upload date:
- Size: 40.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | bba8455c6f830cd2b7bfe5929c7a472bd083ee4e886eb1d047ec109fe0975cf0 |
| MD5 | ca1ade0b931dc52dbcb8e3189b3e8cc1 |
| BLAKE2b-256 | e734b58cc2ae28adcab76ed2fdc64dd42ed8c916c4930be28d19aa0a32ecc44d |