# Agent Gateway

A FastAPI extension for building API-first AI agent services. Define agents, tools, and skills as markdown files, then serve them as a production-ready API with authentication, persistence, scheduling, notifications, and more.
## Quick Start

```shell
pip install "agents-gateway[all]"

# Scaffold a new project
agents-gateway init myproject
cd myproject

# Start the server
agents-gateway serve
```

Your agent API is now running at http://localhost:8000, with interactive docs at /docs.
## Define an Agent

Create a markdown file at `workspace/agents/assistant/AGENT.md`:

```markdown
---
description: A helpful assistant that answers questions
skills:
  - general-tools
memory:
  enabled: true
---

You are a helpful assistant. Answer questions clearly and concisely.
```

That's it — the agent is now available via the API.
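The `AGENT.md` layout above, YAML frontmatter followed by the system prompt, can be illustrated with a simplified split. This is a sketch of the file structure only, not agents-gateway's actual loader:

```python
# Illustrative sketch: separate an AGENT.md file into its YAML frontmatter
# and the prompt body. The real loader in agents-gateway also parses the
# YAML; this only shows how the two parts of the file relate.

def split_agent_file(text: str) -> tuple[str, str]:
    """Split a markdown file into (frontmatter, body) at the --- fences."""
    _, frontmatter, body = text.split("---", 2)
    return frontmatter.strip(), body.strip()

agent_md = """---
description: A helpful assistant that answers questions
skills:
  - general-tools
---
You are a helpful assistant. Answer questions clearly and concisely.
"""

meta, prompt = split_agent_file(agent_md)
print(prompt)  # the system prompt the agent will use
```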
## Add a Tool

### File-based tool

Create `workspace/tools/http-example/TOOL.md`:

```markdown
---
name: http-example
description: Make an HTTP GET request and return the response
parameters:
  url:
    type: string
    description: The URL to fetch
    required: true
---
```

Add a handler in `workspace/tools/http-example/handler.py`:

```python
import httpx

async def handler(url: str) -> str:
    async with httpx.AsyncClient() as client:
        resp = await client.get(url)
        return resp.text
```
### Code-based tool

Register tools directly in Python:

```python
from agent_gateway import Gateway

gw = Gateway(workspace="./workspace")

@gw.tool(agent="assistant")
def add_numbers(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b
```
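A decorator like `@gw.tool` typically records the function in a per-agent registry while leaving it directly callable. The stand-in below shows that general shape; it is not agents-gateway's internals, just the common pattern such decorators follow:

```python
# Minimal stand-in for the tool-registration pattern shown above.
# Names here (ToolRegistry, registry.tools) are illustrative only.
class ToolRegistry:
    def __init__(self):
        self.tools = {}

    def tool(self, agent: str):
        def register(fn):
            # Record the tool under the target agent, keyed by function name.
            self.tools.setdefault(agent, {})[fn.__name__] = fn
            return fn  # the function itself is returned unchanged
        return register

registry = ToolRegistry()

@registry.tool(agent="assistant")
def add_numbers(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b

print(add_numbers(2, 3))               # still a plain callable
print(list(registry.tools["assistant"]))
```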
## Use the API

```shell
# Invoke an agent (single-turn)
curl -X POST http://localhost:8000/v1/agents/assistant/invoke \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{"message": "What is 2 + 3?"}'

# Chat with an agent (multi-turn)
curl -X POST http://localhost:8000/v1/agents/assistant/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{"message": "Hello!"}'
```
## Features
- Markdown-defined agents — Define agents, tools, and skills as markdown files with YAML frontmatter
- Multi-LLM support — Use any model supported by LiteLLM (OpenAI, Gemini, Anthropic, Ollama, etc.)
- Built-in authentication — API key and OAuth2/JWT auth out of the box
- Persistence — SQLite or PostgreSQL storage for conversations, executions, and audit logs
- Dashboard — Built-in web dashboard for monitoring agents, executions, and conversations
- Scheduling — Cron-based agent scheduling via APScheduler
- Notifications — Slack and webhook notification backends with per-agent rules
- Async execution — Queue-based async processing with Redis or RabbitMQ
- Telemetry — OpenTelemetry instrumentation with console or OTLP export
- Structured output — Pydantic model or JSON Schema output validation
- Agent memory — Automatic memory extraction and recall across conversations
- Streaming — Server-sent events (SSE) for real-time chat responses
- Input/output schemas — JSON Schema validation for agent inputs and outputs
- CLI — Project scaffolding, agent listing, and dev server via the `agents-gateway` CLI
- Lifecycle hooks — `before_invoke`, `after_invoke`, and `on_error` hooks for custom logic
- Sub-app mounting — Mount into an existing FastAPI app with `gw.mount_to(app, path="/ai")` — full feature parity
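The streaming feature delivers chat responses as server-sent events, a standard line-based format where each event carries `data:` lines and events are separated by blank lines. A minimal parser for that standard format looks like this (the sample payload is illustrative, not the gateway's actual event schema):

```python
def parse_sse(stream: str) -> list[str]:
    """Extract data payloads from an SSE stream (events end at a blank line)."""
    events = []
    for block in stream.split("\n\n"):
        # An event may span several data: lines; join them with newlines.
        data_lines = [line[len("data:"):].strip()
                      for line in block.splitlines()
                      if line.startswith("data:")]
        if data_lines:
            events.append("\n".join(data_lines))
    return events

sample = "data: Hello\n\ndata: world\n\ndata: [DONE]\n\n"
print(parse_sse(sample))  # ['Hello', 'world', '[DONE]']
```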
## Sub-App Mounting

Mount the gateway into an existing FastAPI app with full feature parity — dashboard, auth, OAuth2, static assets, and all background subsystems work identically:

```python
from fastapi import FastAPI
from agent_gateway import Gateway

app = FastAPI(title="My App")
gw = Gateway(workspace="./workspace")
gw.use_api_keys([{"name": "dev", "key": "secret", "scopes": ["*"]}])
gw.use_dashboard(auth_username="user", auth_password="pass",
                 admin_username="admin", admin_password="admin")
gw.mount_to(app, path="/ai")

# Your routes at /
# Gateway API at /ai/v1/...
# Dashboard at /ai/dashboard/
```

See the mounting guide for details.
## Configuration

Configure your gateway with `workspace/gateway.yaml`:

```yaml
server:
  port: 8000

model:
  default: "gemini/gemini-2.0-flash"
  temperature: 0.1

memory:
  enabled: true
```

Or configure programmatically:

```python
from agent_gateway import Gateway

gw = Gateway(
    workspace="./workspace",
    title="My Agent Service",
)

# Fluent API for backends
gw.use_api_key_auth(api_key="your-key")
gw.use_sqlite("sqlite+aiosqlite:///data.db")
gw.use_slack_notifications(bot_token="xoxb-...", default_channel="#alerts")
```
## Installation Extras

Install only what you need:

```shell
pip install "agents-gateway[sqlite]"     # SQLite persistence
pip install "agents-gateway[postgres]"   # PostgreSQL persistence
pip install "agents-gateway[redis]"      # Redis queue backend
pip install "agents-gateway[rabbitmq]"   # RabbitMQ queue backend
pip install "agents-gateway[oauth2]"     # OAuth2/JWT authentication
pip install "agents-gateway[slack]"      # Slack notifications
pip install "agents-gateway[dashboard]"  # Web dashboard
pip install "agents-gateway[otlp]"       # OTLP telemetry export
pip install "agents-gateway[all]"        # Everything
```
## Documentation
Full documentation is available at vince-nyanga.github.io/agents-gateway.
## License
MIT