Convert an AI Agent into an A2A server! ✨

FastA2A

FastA2A is an agentic-framework-agnostic implementation of the A2A protocol in Python. The library is designed to work with any agentic framework and is not exclusive to PydanticAI.

Interactive Chat

Installation

FastA2A is available on PyPI as fasta2a so installation is as simple as:

pip install fasta2a  # or `uv add fasta2a`

The only dependencies are:

  • starlette: to expose the A2A server as an ASGI application
  • pydantic: to validate the request/response messages
  • opentelemetry-api: to provide tracing capabilities

Usage

To use FastA2A, you need to bring the Storage, Broker and Worker components.

FastA2A was designed with the mindset that the worker can, and should, live outside the web server, i.e. you can run the worker on a different machine or even in a different process.

You can use the InMemoryStorage and InMemoryBroker to get started, but you'll need to implement the Worker to be able to execute the tasks with your agentic framework. Let's see an example:

import uuid
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager
from typing import Any

from fasta2a import FastA2A, Worker
from fasta2a.broker import InMemoryBroker
from fasta2a.schema import Artifact, Message, TaskIdParams, TaskSendParams, TextPart
from fasta2a.storage import InMemoryStorage

Context = list[Message]
"""The shape of the context you store in the storage."""


class InMemoryWorker(Worker[Context]):
    async def run_task(self, params: TaskSendParams) -> None:
        task = await self.storage.load_task(params['id'])
        assert task is not None

        await self.storage.update_task(task['id'], state='working')

        context = await self.storage.load_context(task['context_id']) or []
        context.extend(task.get('history', []))

        # Call your agent here...
        message = Message(
            role='agent',
            parts=[TextPart(text=f'Your context is {len(context) + 1} messages long.', kind='text')],
            kind='message',
            message_id=str(uuid.uuid4()),
        )

        # Append the new message to the context.
        context.append(message)

        artifacts = self.build_artifacts(123)
        await self.storage.update_context(task['context_id'], context)
        await self.storage.update_task(task['id'], state='completed', new_messages=[message], new_artifacts=artifacts)

    async def cancel_task(self, params: TaskIdParams) -> None: ...

    def build_message_history(self, history: list[Message]) -> list[Any]: ...

    def build_artifacts(self, result: Any) -> list[Artifact]: ...


storage = InMemoryStorage()
broker = InMemoryBroker()
worker = InMemoryWorker(storage=storage, broker=broker)


@asynccontextmanager
async def lifespan(app: FastA2A) -> AsyncIterator[None]:
    async with app.task_manager:
        async with worker.run():
            yield


app = FastA2A(storage=storage, broker=broker, lifespan=lifespan)

You can run this example as is with uvicorn main:app --reload.
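
Once the server is running, clients interact with it over the A2A JSON-RPC API. As a minimal sketch (not part of the FastA2A API itself), assuming the server listens on http://localhost:8000, exposes its JSON-RPC endpoint at the root, and speaks the message/send method from the current A2A specification, a request could look like this:

import uuid

import httpx

# Hypothetical client-side request. The payload follows the camelCase wire
# format of the A2A `message/send` JSON-RPC method; adjust it to the protocol
# version your server targets.
payload = {
    'jsonrpc': '2.0',
    'id': 1,
    'method': 'message/send',
    'params': {
        'message': {
            'role': 'user',
            'parts': [{'kind': 'text', 'text': 'Hello, agent!'}],
            'kind': 'message',
            'messageId': str(uuid.uuid4()),
        }
    },
}

response = httpx.post('http://localhost:8000/', json=payload)
response.raise_for_status()
print(response.json())  # expected to contain the newly created task

The server responds right away with the submitted task; the Worker then processes it in the background.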

Using PydanticAI

Initially, FastA2A lived in the PydanticAI repository, but after receiving community feedback, we decided to move it to a separate repository.

[!NOTE] Other agentic frameworks are welcome to implement the Worker component, and we'll be happy to add a reference here.

For reference, you can check the PydanticAI implementation of the Worker.

Let's see how to use it in practice:

from pydantic_ai import Agent

agent = Agent('openai:gpt-4.1')
app = agent.to_a2a()

You can run this example as is with uvicorn main:app --reload.

As you can see, it's pretty easy from the point of view of a developer using your agentic framework.
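
The same one-liner works for richer agents. As a sketch (the model name, instructions, and tool below are placeholders), anything registered on the agent is exposed through the resulting A2A server as well:

from datetime import datetime, timezone

from pydantic_ai import Agent

agent = Agent(
    'openai:gpt-4.1',
    instructions='You are a helpful assistant that can tell the current time.',
)


@agent.tool_plain
def current_time() -> str:
    """Return the current UTC time as an ISO 8601 string."""
    return datetime.now(timezone.utc).isoformat()


# Everything configured on the agent (tools, instructions, etc.) is served
# by the A2A application returned by `to_a2a()`.
app = agent.to_a2a()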

Design

FastA2A is built on top of Starlette, which means it's fully compatible with any ASGI server.

Given the nature of the A2A protocol, it's important to understand the design before using it. As a developer, you'll need to provide some components:

  • Storage: to save and load tasks and the conversation context
  • Broker: to schedule tasks
  • Worker: to execute tasks

Let's have a look at how those components fit together:

flowchart TB
    Server["HTTP Server"] <--> |Sends Requests/<br>Receives Results| TM

    subgraph CC[Core Components]
        direction RL
        TM["TaskManager<br>(coordinates)"] --> |Schedules Tasks| Broker
        TM <--> Storage
        Broker["Broker<br>(queues & schedules)"] <--> Storage["Storage<br>(persistence)"]
        Broker --> |Delegates Execution| Worker
    end

    Worker["Worker<br>(implementation)"]

FastA2A allows you to bring your own Storage, Broker and Worker.

You can also leverage the in-memory implementations of Storage and Broker by using the InMemoryStorage and InMemoryBroker:

from fasta2a.broker import InMemoryBroker
from fasta2a.storage import InMemoryStorage

storage = InMemoryStorage()
broker = InMemoryBroker()

Tasks and Context

FastA2A is opinionated about how it implements the A2A protocol. When the server receives a message, the specification allows it to either:

  • Send a stateless message back to the client
  • Create a stateful Task and run it in the background

FastA2A will always create a Task and run it in the background (on the Worker).

[!NOTE] You can read more about it here.

  • Task: Represents one complete execution of an agent. When a client sends a message to the agent, a new task is created. The agent runs until completion (or failure), and this entire execution is considered one task. The final output should be stored as a task artifact.

  • Context: Represents a conversation thread that can span multiple tasks. The A2A protocol uses a context_id to maintain conversation continuity:

    • When a new message is sent without a context_id, the server generates a new one
    • Subsequent messages can include the same context_id to continue the conversation (see the client sketch below)
    • All tasks sharing the same context_id have access to the complete message history
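
For illustration, here is a rough client-side sketch of that continuity. It assumes the JSON-RPC result of message/send is a Task carrying a contextId field (camelCase wire format, per the A2A specification); the helper function is hypothetical, not part of FastA2A:

import uuid
from typing import Any

import httpx


def send_text(text: str, context_id: str | None = None) -> dict[str, Any]:
    """Send a user message via `message/send`, optionally reusing a context_id."""
    message: dict[str, Any] = {
        'role': 'user',
        'parts': [{'kind': 'text', 'text': text}],
        'kind': 'message',
        'messageId': str(uuid.uuid4()),
    }
    if context_id is not None:
        # Reusing the context_id keeps the new task in the same conversation thread.
        message['contextId'] = context_id
    payload = {'jsonrpc': '2.0', 'id': 1, 'method': 'message/send', 'params': {'message': message}}
    response = httpx.post('http://localhost:8000/', json=payload)
    response.raise_for_status()
    return response.json()['result']


first_task = send_text('My name is Ada.')
# The second task shares the context, so the worker sees the full message history.
second_task = send_text('What is my name?', context_id=first_task['contextId'])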

Storage

The Storage component serves two purposes:

  1. Task Storage: Stores tasks in A2A protocol format, including their status, artifacts, and message history
  2. Context Storage: Stores conversation context in a format optimized for the specific agent implementation

This design allows agents to store rich internal state (e.g., tool calls, reasoning traces) alongside the task-specific A2A-formatted messages and artifacts.
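
Because the context type is chosen by the worker (note the Worker[Context] parametrization in the example above, where Context = list[Message]), you are free to persist more than plain A2A messages. A hypothetical richer context shape could look like this; the field names are illustrative and not part of the fasta2a API:

from typing import Any, TypedDict

from fasta2a.schema import Message


class RichContext(TypedDict):
    """Illustrative context shape; store whatever your framework needs to resume work."""

    messages: list[Message]  # A2A-formatted conversation history
    tool_calls: list[dict[str, Any]]  # hypothetical: tool invocations or reasoning traces


# A worker declared as `Worker[RichContext]` would then persist this shape via
# `storage.load_context(...)` and `storage.update_context(...)`.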

License

This project is licensed under the MIT License - see the LICENSE file for details.
