AI Agent Development Acceleration Kit — build, run, and orchestrate intelligent agents with a production‑ready Agent‑to‑Agent (A2A) runtime.

Project description

Aigency

Aigency provides primitives and utilities to define agents via simple YAML, instantiate them programmatically, and serve them over HTTP using the A2A server. It is designed to be modular, observable, and extensible.

  • Python: >= 3.12
  • PyPI package: aigency
  • Core deps: a2a-sdk, pyyaml, litellm, PyJWT, google-adk

Features

  • Config‑first agents: define agent behavior, skills, tools, and model in YAML
  • Agent generator: instantiate agents, build agent cards, and executors programmatically
  • A2A integration: serve agents over HTTP with Starlette‑based A2A server
  • MCP‑friendly: integrate external tools/services via Model Context Protocol (optional)
  • Observability: compatible with Phoenix and A2A Inspector for tracing and debugging
  • Docker‑friendly: used across example demos and containers

Installation

pip install aigency

Requires Python 3.12+.

Quickstart

Minimal example for a single agent (no MCP) that responds in the user’s language.

  1. Create an agent config file (e.g., agent_config.yaml):
metadata:
  name: hello_agent
  description: A simple example agent that greets and answers briefly.
  version: 1.0.0

service:
  url: http://hello-agent:8080
  capabilities:
    streaming: true
  interface:
    default_input_modes: [text, text/plain]
    default_output_modes: [text, text/plain]

agent:
  model:
    name: gemini-2.0-flash

  instruction: |
    You are a friendly, concise assistant. Always reply in the same language as the user.
    Keep responses short and helpful.

  skills:
    - id: greet
      name: Greet
      description: Greets users and offers help
      examples:
        - "Hello! How can I help you today?"
  2. Run a tiny A2A app (e.g., app.py):
import os
import uvicorn
from a2a.server.apps import A2AStarletteApplication
from a2a.server.request_handlers import DefaultRequestHandler
from a2a.server.tasks import InMemoryTaskStore
from aigency.agents.generator import AgentA2AGenerator
from aigency.utils.config_service import ConfigService

CONFIG_PATH = os.path.join(os.path.dirname(__file__), "agent_config.yaml")

config_service = ConfigService(config_file=CONFIG_PATH)
agent_config = config_service.config

agent = AgentA2AGenerator.create_agent(agent_config=agent_config)
agent_card = AgentA2AGenerator.build_agent_card(agent_config=agent_config)
executor = AgentA2AGenerator.build_executor(agent=agent, agent_card=agent_card)

request_handler = DefaultRequestHandler(
    agent_executor=executor,
    task_store=InMemoryTaskStore(),
)
app = A2AStarletteApplication(
    agent_card=agent_card,
    http_handler=request_handler,
).build()

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8080)
  3. Start the server:
python app.py

The server is now available at http://localhost:8080; interact with it over the A2A HTTP interface or connect a compatible client such as A2A Inspector.
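
Once the server is running, clients talk JSON-RPC 2.0 over HTTP POST. A minimal client sketch, assuming the A2A protocol's `message/send` method posted to the server root (field names follow the A2A spec; check your a2a-sdk version for exact details):

```python
import json
import urllib.request
import uuid

def build_send_payload(text: str) -> dict:
    """Build a JSON-RPC 2.0 request for the A2A message/send method."""
    return {
        "jsonrpc": "2.0",
        "id": str(uuid.uuid4()),
        "method": "message/send",
        "params": {
            "message": {
                "role": "user",
                "parts": [{"kind": "text", "text": text}],
                "messageId": str(uuid.uuid4()),
            }
        },
    }

payload = build_send_payload("Hello!")
request = urllib.request.Request(
    "http://localhost:8080/",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Requires the server from app.py to be running:
# with urllib.request.urlopen(request) as response:
#     print(json.loads(response.read()))
```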

Using Models & Providers

Aigency integrates with LLM providers via its dependencies. For Google Gemini models:

  • Use API key (Google AI Studio):
    • GEMINI_API_KEY=your_gemini_api_key
    • GOOGLE_GENAI_USE_VERTEXAI=FALSE
  • Or use Vertex AI (requires additional env like project/region and credentials):
    • GOOGLE_GENAI_USE_VERTEXAI=TRUE

Set these environment variables before running your app if you use Gemini‑based models.
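
For the API-key path, the variables can be exported in the shell or set in-process before the model client is first used; a short sketch with a placeholder key:

```python
import os

# Use Google AI Studio (API key) instead of Vertex AI.
os.environ["GOOGLE_GENAI_USE_VERTEXAI"] = "FALSE"
os.environ["GEMINI_API_KEY"] = "your_gemini_api_key"  # placeholder, not a real key
```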

Configuration Reference (YAML)

Common top‑level sections:

  • metadata: name, description, version
  • service: url, capabilities, interface defaults
  • agent:
    • model: model name (e.g., gemini-2.0-flash)
    • instruction: system prompt/persona
    • skills: list of skills with id, name, description, and examples
    • tools: optional integrations (e.g., MCP tools)
  • observability: optional Phoenix/A2A Inspector configuration

Example of adding an MCP tool:

tools:
  - type: mcp
    name: sample_mcp
    description: Example MCP tool
    mcp_config:
      url: sample-mcp-service
      port: 8080
      path: /mcp/
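
Since agent configs are plain YAML (pyyaml is a core dependency), a tool entry like the one above can be inspected before wiring it up. A quick sketch — the field names mirror the example here, not a guaranteed schema:

```python
import yaml

CONFIG_SNIPPET = """
tools:
  - type: mcp
    name: sample_mcp
    description: Example MCP tool
    mcp_config:
      url: sample-mcp-service
      port: 8080
      path: /mcp/
"""

config = yaml.safe_load(CONFIG_SNIPPET)
tool = config["tools"][0]
mcp = tool["mcp_config"]
# Assemble the endpoint an MCP client would target.
endpoint = f"http://{mcp['url']}:{mcp['port']}{mcp['path']}"
print(endpoint)  # http://sample-mcp-service:8080/mcp/
```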

Examples & Demos

Explore the ready‑to‑run demos built with Aigency and the project documentation site.

Observability

Aigency‑based apps can be observed with:

  • Phoenix dashboard (tracing/metrics)
  • A2A Inspector (agent/task introspection)

Refer to the demo repositories for docker‑compose setups that launch these services.
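
For local experiments, a minimal docker-compose sketch for a Phoenix dashboard — the `arizephoenix/phoenix` image name and port 6006 are Phoenix's published defaults, and the demo repositories remain the authoritative setup:

```yaml
services:
  phoenix:
    image: arizephoenix/phoenix:latest
    ports:
      - "6006:6006"  # Phoenix UI and trace collector
```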

Development

  • Python 3.12+
  • Install dev deps and run tests as usual; for versioning helpers, see scripts/version_manager.py in this repo.

License

This project’s license is provided in the LICENSE file.

Download files

Download the file for your platform.

Source Distribution

aigency-0.1.1rc139755297.tar.gz (26.0 kB)

Uploaded Source

Built Distribution

aigency-0.1.1rc139755297-py3-none-any.whl (33.5 kB)

Uploaded Python 3

File details

Details for the file aigency-0.1.1rc139755297.tar.gz.

File metadata

  • Download URL: aigency-0.1.1rc139755297.tar.gz
  • Upload date:
  • Size: 26.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for aigency-0.1.1rc139755297.tar.gz:

  • SHA256: f8a6086feb8cefb62666b188aafbe4333d9c779a7af52de630185693f545dbfe
  • MD5: 82cc72d18eab8e0bbfd2fcfca7eb074f
  • BLAKE2b-256: 233258481d0b3021dd1073f7f879cd8f5d2b102898302f37a510b8df9b81a85f

Provenance

The following attestation bundles were made for aigency-0.1.1rc139755297.tar.gz:

Publisher: python-publish.yml on aigency-project/aigency-lib

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file aigency-0.1.1rc139755297-py3-none-any.whl.

File hashes

Hashes for aigency-0.1.1rc139755297-py3-none-any.whl:

  • SHA256: 680becf722df18c08230cbbfd2d3a89f77ccb9e4af6a03365179d3c3548bb5f5
  • MD5: 437261942d9f27457153fb18cebacdde
  • BLAKE2b-256: 0dbd9fe64aadf8d1d7387c6e6bad90b3c9f92e597166a06bd7e98b111b5a0f37

Provenance

The following attestation bundles were made for aigency-0.1.1rc139755297-py3-none-any.whl:

Publisher: python-publish.yml on aigency-project/aigency-lib

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
