# air-agent

Lightweight Python AI agent with OpenAI tool calling, MCP support, and parallel subagents.
A lightweight Python AI Agent library. Built on the OpenAI Chat Completions API with support for tool-calling loops, MCP Server connections, parallel subagents, and streaming output. Designed to be imported directly by other Python projects.
## Installation

```shell
uv add air-agent
```

Or in development mode:

```shell
git clone https://github.com/chldu2000/air-agent.git
cd air-agent
uv sync --group dev
```
## Quick Start

### Basic Conversation

```python
import asyncio

from vibe_agent import Agent, AgentConfig

async def main():
    agent = Agent(AgentConfig(model="gpt-4o"))
    response = await agent.run("Explain quantum computing in one sentence")
    print(response.content)

asyncio.run(main())
```
### Register Local Tools

```python
agent = Agent(AgentConfig(model="gpt-4o", api_key="sk-xxx"))

@agent.tool(name="add", description="Calculate the sum of two numbers")
async def add(a: int, b: int) -> int:
    return a + b

response = await agent.run("What is 3 plus 5?")
# The agent automatically calls the add tool and returns the result
```
Parameter types are inferred from the function signature and converted to the JSON Schema required by OpenAI tool calling.
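As an illustration, a signature like `add(a: int, b: int)` corresponds to an OpenAI tool schema of roughly the following shape. This is a hypothetical sketch of the target format, not captured output from the library's converter:

```python
# Hypothetical illustration of the OpenAI tool-calling schema that a typed
# signature such as `async def add(a: int, b: int) -> int` maps to.
# The exact output of air-agent's converter may differ in detail.
add_schema = {
    "type": "function",
    "function": {
        "name": "add",
        "description": "Calculate the sum of two numbers",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "integer"},  # int -> JSON Schema "integer"
                "b": {"type": "integer"},
            },
            "required": ["a", "b"],  # parameters without defaults are required
        },
    },
}
```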
### Streaming Output

```python
async for event in await agent.run("Write a poem about programming", stream=True):
    if event.type == "text":
        print(event.content, end="", flush=True)
    elif event.type == "tool_call":
        print(f"\n[Calling tool: {event.name}]")
    elif event.type == "tool_result":
        print(f"[Tool result: {event.content}]")
    elif event.type == "done":
        print(f"\nDone, token usage: {event.usage}")
```
### Multi-turn Conversation

```python
response = await agent.run("Hello", conversation_id="session-1")
response = await agent.run("What did I just say?", conversation_id="session-1")
# The second turn includes context from the first turn
```
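Conceptually, multi-turn memory amounts to a per-`conversation_id` message history that is replayed on each call. The sketch below illustrates that idea only; it is not air-agent's actual implementation:

```python
# Conceptual sketch of per-conversation memory, not air-agent's actual code:
# each conversation_id keys its own message list, replayed on every turn.
conversations: dict[str, list[dict]] = {}

def remember(conversation_id: str, role: str, content: str) -> list[dict]:
    """Append a message to the conversation and return the full history."""
    history = conversations.setdefault(conversation_id, [])
    history.append({"role": role, "content": content})
    return history

remember("session-1", "user", "Hello")
history = remember("session-1", "user", "What did I just say?")
# history now holds both turns, so the second request carries the first
```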
### Load Configuration from JSON

```json
{
  "model": "gpt-4o",
  "system_prompt": "You are a coding assistant",
  "mcp_servers": [
    {"command": "npx", "args": ["-y", "@anthropic/mcp-server-filesystem", "/tmp"]},
    {"url": "http://localhost:8080/sse"}
  ]
}
```

```python
config = AgentConfig.from_json("agent-config.json")
agent = Agent(config)
```
The `mcp_servers` field auto-detects the transport type: entries with a `command` use stdio, entries with a `url` use SSE.
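The detection rule can be sketched as a standalone function. This is a hypothetical illustration of the rule described above, not the library's code:

```python
# Hypothetical sketch of the MCP transport auto-detection rule;
# air-agent's actual implementation may differ.
def detect_transport(entry: dict) -> str:
    """Pick an MCP transport from a config entry's keys."""
    if "command" in entry:
        return "stdio"  # local subprocess server
    if "url" in entry:
        return "sse"    # HTTP/SSE server
    raise ValueError("MCP server entry needs either 'command' or 'url'")

detect_transport({"command": "npx", "args": ["-y", "some-server"]})  # "stdio"
detect_transport({"url": "http://localhost:8080/sse"})               # "sse"
```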
### Load Configuration from Environment Variables

```shell
export VIBE_MODEL=gpt-4o
export VIBE_SYSTEM_PROMPT="You are an assistant"
export VIBE_MAX_ITERATIONS=30
export VIBE_MCP_SERVERS='[{"command":"npx","args":["server"]}]'
```

```python
config = AgentConfig.from_env()                 # default VIBE_ prefix
config = AgentConfig.from_env(prefix="MYAPP_")  # custom prefix
agent = Agent(config)
```
Supported environment variables:

| Variable | Type | Description |
|---|---|---|
| `VIBE_MODEL` | str | Model name |
| `VIBE_API_KEY` | str | API key (takes precedence over `OPENAI_API_KEY`) |
| `VIBE_BASE_URL` | str | Custom API endpoint |
| `VIBE_SYSTEM_PROMPT` | str | System prompt |
| `VIBE_MAX_ITERATIONS` | int | Max tool-calling rounds |
| `VIBE_TOOL_TIMEOUT` | float | Tool call timeout in seconds |
| `VIBE_MCP_SERVERS` | JSON | MCP server list |
| `VIBE_DEFAULT_HEADERS` | JSON | Custom request headers |
### Connect to MCP Servers

```python
from vibe_agent import MCPServerStdio, MCPServerSSE

agent = Agent(AgentConfig(
    model="gpt-4o",
    mcp_servers=[
        MCPServerStdio(command="npx", args=["-y", "@anthropic/mcp-server-filesystem", "/tmp"]),
        MCPServerSSE(url="http://localhost:8080/mcp"),
    ],
))

async with agent:  # auto connect/disconnect MCP servers
    response = await agent.run("List files under /tmp")
```
Supports both stdio and StreamableHTTP MCP transports. Once connected, tools exposed by the server are automatically registered in the agent's tool list.
### Parallel Subagents

```python
from vibe_agent import SubagentConfig

results = await agent.delegate(
    tasks=[
        "Analyze the code structure in src/",
        "Check test coverage in tests/",
        "Generate a CHANGELOG",
    ],
    config=SubagentConfig(max_parallel=3, timeout=60),
)

for r in results:
    print(f"[{r.status}] {r.content[:100]}")
```
Each task runs in an independent Agent instance without interference.
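The fan-out pattern behind `delegate` can be approximated with plain asyncio. This is a conceptual sketch, under the assumption that `max_parallel` is enforced with a semaphore; the library's internals may differ:

```python
import asyncio

# Conceptual sketch of parallel fan-out with a concurrency cap, in the spirit
# of SubagentConfig(max_parallel=...); not air-agent's actual implementation.
async def run_all(tasks, worker, max_parallel=3):
    sem = asyncio.Semaphore(max_parallel)

    async def guarded(task):
        async with sem:  # at most max_parallel workers run at once
            return await worker(task)

    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(guarded(t) for t in tasks))

async def demo():
    async def worker(task):
        await asyncio.sleep(0)  # stand-in for an independent subagent run
        return f"done: {task}"
    return await run_all(["a", "b", "c"], worker)

results = asyncio.run(demo())
```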
## Configuration

```python
AgentConfig(
    model="gpt-4o",                        # Model name
    api_key="sk-xxx",                      # Or set the OPENAI_API_KEY env variable
    base_url=None,                         # Custom API endpoint
    system_prompt="You are an assistant",  # System prompt
    max_iterations=20,                     # Max tool-calling rounds
    tool_timeout=30.0,                     # Single tool call timeout (seconds)
    mcp_servers=[],                        # MCP server list
)
```
## Project Structure

```text
src/vibe_agent/
├── __init__.py          # Public API exports
├── agent.py             # Core Agent (ReAct loop + streaming)
├── config.py            # Configuration dataclass
├── types.py             # Response, StreamEvent, SubagentResult
├── tools/
│   ├── base.py          # Tool dataclass
│   └── registry.py      # Tool registry
├── mcp/
│   ├── client.py        # MCP client (stdio + streamable_http)
│   └── tool_adapter.py  # MCP tool → OpenAI format adapter
└── subagent.py          # Parallel subagent manager
```
## Dependencies

- `openai`: LLM calls and tool calling
- `mcp`: MCP protocol client
- `pydantic`: Data validation
## Development

```shell
uv sync --group dev
uv run pytest tests/ -v
```

## License

MIT