# Azure Functions Agent Framework
A powerful, production-ready framework for building AI agents in Azure Functions with Python. Deploy scalable single agents or collaborative multi-agent systems to Azure with enterprise-grade reliability.
## Features
- Production-Ready Azure Functions: Deploy agents as scalable Azure Functions with full HTTP API support
- Single & Multi-Agent Architecture: Build focused single agents or collaborative multi-agent systems
- Multiple LLM Providers: OpenAI, Anthropic Claude, Google Gemini, Ollama, Azure OpenAI
- Model Context Protocol (MCP): Integrate with MCP servers for enhanced tool capabilities
- Real-time Streaming: Server-sent events (SSE) support for live responses
- Enterprise Integration: Built-in Azure services support, Key Vault, monitoring, and logging
- Developer Experience: Complete samples, local development tools, and comprehensive documentation
## Installation

```bash
pip install azurefunctions-agent-framework
```

### Optional Dependencies

Choose the LLM providers you need:

```bash
# For OpenAI
pip install azurefunctions-agent-framework[openai]

# For Anthropic Claude
pip install azurefunctions-agent-framework[anthropic]

# For Google Gemini
pip install azurefunctions-agent-framework[google]

# For Ollama (local models)
pip install azurefunctions-agent-framework[ollama]

# For Azure services integration (Key Vault, etc.)
pip install azurefunctions-agent-framework[azure]

# Install all LLM providers
pip install azurefunctions-agent-framework[openai,anthropic,google,ollama]

# Install everything (all providers + Azure services)
pip install azurefunctions-agent-framework[all]
```
## Quick Start
The fastest way to get started is with our production-ready samples:
### 1. Try the Weather Bot (Single Agent)

```bash
# Clone and set up
cd samples/single-agent
cp local.settings.json.template local.settings.json
# Add your OPENAI_API_KEY and OPENWEATHER_API_KEY

# Install and run
pip install -r requirements.txt
func start

# Test it
curl -X POST http://localhost:7071/api/WeatherBot/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What is the weather in Seattle?"}'
```
### 2. Try the Travel Planner (Multi-Agent)

```bash
# Set up the multi-agent system
cd samples/multi-agent
cp local.settings.json.template local.settings.json
# Add your API keys

# Install and run
pip install -r requirements.txt
func start

# Test different agents
curl -X POST http://localhost:7071/api/agents/FlightAgent/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Find flights from NYC to LAX"}'
```
### 3. Build Your Own Agent

```python
import azure.functions as func
from azurefunctions.agents import Agent, AgentFunctionApp
from azurefunctions.agents.types import LLMConfig, LLMProvider

def my_tool(query: str) -> str:
    """Your custom tool implementation."""
    return f"Processed: {query}"

# Configure LLM
llm_config = LLMConfig(
    provider=LLMProvider.OPENAI,
    model_name="gpt-4",
    api_key="your-openai-api-key"
)

# Create agent
my_agent = Agent(
    name="MyAgent",
    instructions="You are a helpful assistant with custom tools.",
    tools=[my_tool],
    llm_config=llm_config
)

# Deploy as Azure Function
app = AgentFunctionApp(agents={"MyAgent": my_agent})
```
## API Endpoints
### Standard Agent Deployments

All standard agent deployments (single- and multi-agent) use the same consistent API pattern:

```
POST /api/agents/{agent_name}/chat   # Chat with any agent
GET  /api/agents/{agent_name}/info   # Get agent information
GET  /api/agents                     # List all available agents
GET  /api/health                     # Health check
```
### A2A Protocol Deployments

Agent-to-Agent (A2A) protocol deployments follow the A2A specification and use JSON-RPC 2.0 over HTTP:

```
POST {agent_url}               # JSON-RPC endpoint for all A2A methods
GET  /.well-known/agent.json   # Agent Card discovery (A2A spec)
GET  /api/agents               # List all available agents (framework)
GET  /api/health               # Health check (framework)
```
A2A JSON-RPC Methods:

- `message/send` - Send messages to the agent
- `message/stream` - Send messages with streaming responses
- `tasks/get` - Get task status
- `tasks/cancel` - Cancel tasks
- Push notification configuration methods
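For example, a `message/send` call is a plain JSON-RPC 2.0 request POSTed to the agent URL. The payload shape below follows the A2A specification; field names such as `parts`, `kind`, and `messageId` come from that spec and may vary by version, so treat this as a sketch:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "message/send",
  "params": {
    "message": {
      "role": "user",
      "parts": [{ "kind": "text", "text": "What is the weather in Seattle?" }],
      "messageId": "msg-001"
    }
  }
}
```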
### Single Agent Example (Weather Bot)

```bash
# Chat with the agent
curl -X POST http://localhost:7071/api/agents/WeatherBot/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What is the weather in Tokyo?"}'

# Get agent info
curl http://localhost:7071/api/agents/WeatherBot/info

# List agents (will show 1 agent)
curl http://localhost:7071/api/agents

# Health check
curl http://localhost:7071/api/health
```
### Multi-Agent Example (Travel Planner)

```bash
# Chat with the flight agent
curl -X POST http://localhost:7071/api/agents/FlightAgent/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Find flights from Seattle to Tokyo"}'

# Chat with the hotel agent
curl -X POST http://localhost:7071/api/agents/HotelAgent/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Find hotels in Tokyo"}'

# List all agents (will show multiple agents)
curl http://localhost:7071/api/agents

# Health check
curl http://localhost:7071/api/health
```
Benefits of Unified Routing:
- Same API pattern works for single and multi-agent deployments
- Easy to migrate from single to multi-agent (just add more agents)
- Predictable and consistent for developers
- Tools and integrations work across different deployment modes
## Framework Architecture

The Azure Functions Agent Framework follows a clean, modular architecture that separates concerns and enables flexible deployment patterns.

### Framework Architecture Overview
The framework is organized into five layers, from HTTP hosting down to external integrations, with planned support for alternative agent frameworks:

**Application Layer**

- `AgentFunctionApp` (HTTP host): standard endpoints, multi-agent routing, A2A protocol support
- Custom triggers: event-driven workflows, message queues, custom logic
- Manual integration: direct `Runner` usage, testing and automation, programmatic access

**Execution Layer**

- `Runner` (agent execution abstraction): request/response normalization, multi-agent handoffs, async/sync execution, conversation management, framework-agnostic, testing support

**Agent Layer**

- `Agent` (base): core capabilities, tool management, LLM integration
- `ReflectionAgent`: self-evaluation, iterative improvement, quality assessment
- Custom agent types (future): specialized behaviors, domain-specific logic, extended functionality

**Core Framework Components**

- Tool Registry: function tools, MCP integration, schema generation
- Handoff Engine: multi-agent handoffs, swarm pattern, coordinator pattern
- Control Flow Manager: conversation state, session management, context tracking
- Request/Response: type safety, serialization, validation

**Integration Layer**

- LLM providers: OpenAI, Anthropic Claude, Google Gemini, Azure OpenAI, Ollama (local)
- Tool integration: MCP protocol (STDIO, SSE, and HTTP servers) and function tools (Python functions, async/sync support, automatic schema generation)
- External services: Azure services, Key Vault, Cosmos DB, Service Bus, Storage

**Alternative Framework Support (Future)**

- Azure Functions Agent Framework (current)
- Semantic Kernel integration (planned)
- OpenAI Agents SDK integration (planned)
### Architecture Benefits

#### Layered Separation of Concerns
- Application Layer: HTTP handling, routing, and deployment patterns
- Execution Layer: Framework-agnostic agent execution and handoff management
- Agent Layer: AI agent logic and specialized behaviors
- Core Framework: Reusable components for tool management and workflow control
- Integration Layer: External service connections and protocol implementations
#### Pluggable Components
- LLM Providers: Easy switching between OpenAI, Claude, Gemini, etc.
- Tool Systems: Both function tools and MCP server integration
- Deployment Modes: Azure Functions, custom triggers, or direct programmatic use
- Agent Types: Base Agent, ReflectionAgent, and extensible custom types
#### Future Extensibility
- Framework Interop: Planned support for Semantic Kernel and OpenAI Agents SDK
- Protocol Standards: MCP compliance enables broad tool ecosystem integration
- Custom Agent Types: Architecture supports specialized agent implementations
- Multiple Hosting: Framework designed to work beyond just Azure Functions
### Core Components

#### 1. AgentFunctionApp - The Function Host

`AgentFunctionApp` is the Azure Functions hosting layer that manages HTTP endpoints, routing, and agent lifecycle:
```python
from azurefunctions.agents import AgentFunctionApp, AgentMode

# Single-agent deployment
app = AgentFunctionApp(
    agents={"WeatherBot": weather_agent},
    mode=AgentMode.AZURE_FUNCTION_AGENT
)

# Multi-agent deployment
app = AgentFunctionApp(
    agents={
        "FlightAgent": flight_agent,
        "HotelAgent": hotel_agent,
        "WeatherAgent": weather_agent
    },
    mode=AgentMode.AZURE_FUNCTION_AGENT
)
```
Key Responsibilities:
- HTTP Endpoint Management: Automatically registers routes based on deployment mode
- Request Routing: Routes incoming requests to appropriate agents
- Authentication: Handles Azure Functions authentication levels
- Agent Lifecycle: Manages agent initialization and cleanup
- Error Handling: Provides consistent error responses across all endpoints
Deployment Modes:

- `AZURE_FUNCTION_AGENT`: Standard HTTP endpoints for agent communication
- `A2A`: Agent-to-Agent protocol endpoints (single-agent only)
#### 2. Agent - The Core Agent Class

`Agent` is the base class that represents a single AI agent with its capabilities:
```python
from azurefunctions.agents import Agent

agent = Agent(
    name="MyAgent",
    instructions="You are a helpful assistant",
    tools=[custom_tool],
    mcp_servers=[mcp_server],
    llm_config=llm_config,
    enable_conversational_agent=True
)
```
Key Responsibilities:
- Tool Management: Registers and executes function tools and MCP tools
- LLM Integration: Handles communication with language model providers
- MCP Integration: Connects to Model Context Protocol servers
- Request Processing: Processes chat requests and manages conversation flow
- Privacy Controls: Manages information exposure via GET endpoints
#### 3. ReflectionAgent - Advanced Self-Improving Agent

`ReflectionAgent` extends the base `Agent` with self-evaluation and improvement capabilities:
```python
from azurefunctions.agents import ReflectionAgent

reflection_agent = ReflectionAgent(
    name="SmartAgent",
    instructions="You are an AI that reflects on and improves responses",
    llm_config=llm_config,
    # Reflection-specific parameters
    max_reflection_iterations=3,
    reflection_threshold=0.8,
    enable_self_evaluation=True
)
```
Advanced Capabilities:
- Self-Evaluation: Automatically assesses response quality using configurable criteria
- Iterative Improvement: Refines responses through reflection loops
- Quality Thresholds: Stops improvement when quality targets are met
- Custom Evaluation: Supports custom evaluation functions and prompts
- Reflection Tracking: Maintains history of improvement iterations
#### 4. Runner - Agent Execution Abstraction

`Runner` provides a clean, framework-agnostic abstraction for executing agents programmatically. It handles request normalization and response generation without any HTTP or Azure Functions dependencies:
```python
from azurefunctions.agents.runner import Runner
from azurefunctions.agents.types import ChatRequest

# Create a runner for an agent
runner = Runner(agent)

# Execute with different input types
response = await runner.run("Simple string message")
response = await runner.run({"message": "Dictionary input"})

# Use structured requests (recommended)
chat_request = ChatRequest(
    message="What's the weather?",
    user_id="user-123",
    session_id="session-456",
    context={"location": "Seattle"}
)
response = await runner.run(chat_request)
```
Key Responsibilities:
- Input Normalization: Accepts strings, dicts, or structured Request objects
- Agent Execution: Runs agents and handles async/sync execution patterns
- Response Generation: Returns structured Response objects
- Framework Agnostic: No HTTP, Azure Functions, or web-specific dependencies
#### 5. Request/Response Abstractions

The framework provides clean abstractions for agent input and output that separate business logic from transport concerns:
```python
from azurefunctions.agents.types import ChatRequest, ChatResponse

# Structured request with rich metadata
request = ChatRequest(
    message="What's the weather in Seattle?",
    user_id="user-123",
    session_id="session-456",
    context={"preferred_units": "fahrenheit"}
)

# Process and get a structured response
response = await runner.run(request)

# Response contains rich information
print(f"Status: {response.status}")
print(f"Response: {response.response}")
print(f"Context: {response.context}")
print(f"Error: {response.error}")  # If any

# Convert to different formats
response_dict = response.to_dict()
```
Benefits:
- Type Safety: Full type hints and validation
- Clean Separation: Business logic separate from HTTP/transport concerns
- Testability: Easy to test without HTTP infrastructure
- Flexibility: Support different transport mechanisms (HTTP, message queues, etc.)
### Architecture Patterns

#### Single-Agent Pattern

Best for: focused, specialized applications
```
┌──────────────┐    ┌──────────────────┐    ┌──────────────┐
│ HTTP Request │───▶│ AgentFunctionApp │───▶│ Single Agent │
│              │    │    (Routing)     │    │ (Processing) │
└──────────────┘    └──────────────────┘    └──────────────┘
                                                   │
                                                   ▼
                                            ┌──────────────┐
                                            │ Tools & MCP  │
                                            │   Servers    │
                                            └──────────────┘
```
Endpoints Generated:

- `POST /api/agents/{AgentName}/chat` - Chat with the agent
- `GET /api/agents/{AgentName}/info` - Get agent information
#### Multi-Agent Pattern

Best for: complex workflows requiring specialized agents
```
┌──────────────┐    ┌──────────────────┐    ┌──────────────┐
│ HTTP Request │───▶│ AgentFunctionApp │───▶│ Agent Router │
│              │    │   (Multi-mode)   │    │              │
└──────────────┘    └──────────────────┘    └──────────────┘
                                                   │
              ┌────────────────────────────────────┼────────────────────┐
              ▼                                    ▼                    ▼
      ┌──────────────┐                     ┌──────────────┐     ┌───────────────┐
      │ Flight Agent │                     │ Hotel Agent  │     │ Weather Agent │
      └──────────────┘                     └──────────────┘     └───────────────┘
```
Endpoints Generated:

- `POST /api/agents/{agent_name}/chat` - Chat with a specific agent
- `GET /api/agents` - List all agents
- Custom application endpoints (optional)
## Multi-Agent Handoffs
The Azure Functions Agent Framework provides a powerful handoff system that enables agents to seamlessly collaborate and delegate tasks to each other. This system supports both Swarm (peer-to-peer) and Coordinator (manager-orchestrated) patterns for sophisticated multi-agent workflows.
### Key Concepts

#### Handoff Modes

- `SWARM`: Agents collaborate organically; control bubbles up to the user
- `COORDINATOR`: One agent orchestrates others and returns a consolidated result
- `SEQUENTIAL`: Linear handoff chain between agents
- `CONDITIONAL`: Handoff based on dynamic conditions
#### Control Return Strategies

- `BUBBLE_UP`: Return control to the user/caller (default)
- `RETURN_TO_CALLER`: Return to the agent that called this one
- `CONTINUE_CHAIN`: Continue to the next agent in the chain
- `END_CONVERSATION`: End the conversation
### Quick Start: Swarm Pattern
Agents collaborate peer-to-peer with results bubbling up:
```python
from azurefunctions.agents import Agent, AgentFunctionApp
from azurefunctions.agents.handoff import HandoffConfig, HandoffTarget, HandoffMode

# Create specialized agents
weather_agent = Agent(
    name="weather",
    instructions="You provide weather information",
    tools=[get_weather],
    handoff_config=HandoffConfig(
        mode=HandoffMode.SWARM,
        targets=[HandoffTarget(agent_name="temperature_converter")]
    )
)

temp_agent = Agent(
    name="temperature_converter",
    instructions="You convert temperatures between units",
    tools=[convert_temperature],
    handoff_config=HandoffConfig(
        mode=HandoffMode.SWARM,
        targets=[HandoffTarget(agent_name="weather")]
    )
)

# Deploy with the handoff system
app = AgentFunctionApp(agents=[weather_agent, temp_agent])
```
### Quick Start: Coordinator Pattern
One agent orchestrates others and returns consolidated results:
```python
# Coordinator agent
coordinator = Agent(
    name="travel_coordinator",
    instructions="You coordinate travel planning across multiple agents",
    handoff_config=HandoffConfig(
        mode=HandoffMode.COORDINATOR,
        targets=[
            HandoffTarget(agent_name="flight_agent"),
            HandoffTarget(agent_name="hotel_agent"),
            HandoffTarget(agent_name="weather_agent")
        ]
    )
)

# Specialist agents
flight_agent = Agent(name="flight_agent", instructions="You search for flights", tools=[search_flights])
hotel_agent = Agent(name="hotel_agent", instructions="You search for hotels", tools=[search_hotels])
weather_agent = Agent(name="weather_agent", instructions="You provide weather info", tools=[get_weather])

app = AgentFunctionApp(agents=[coordinator, flight_agent, hotel_agent, weather_agent])
```
### Runner-Based Handoffs

The framework uses `Runner` objects for direct agent-to-agent communication:
```python
# Get runners from the app
weather_runner = app.runners["weather"]
temp_runner = app.runners["temperature_converter"]

# Direct handoff between agents
async def handle_request():
    # Weather agent processes the initial request
    weather_response = await weather_runner.run("What's the weather in Seattle?")

    # Hand off to the temperature converter
    temp_response = await weather_runner.handoff_to(
        target_agent="temperature_converter",
        input_data={"celsius": 22, "target_unit": "fahrenheit"},
        conversation_id="user-session-123",
        reason="User requested temperature conversion"
    )
    return temp_response
```
### Advanced Configuration

#### Conditional Handoffs
```python
from azurefunctions.agents.handoff import HandoffConfig, HandoffTarget, HandoffMode

def needs_translation(request_data):
    """Check if the request needs translation."""
    return any(keyword in request_data.get('message', '').lower()
               for keyword in ['translate', 'español', 'français'])

agent = Agent(
    name="multilingual_assistant",
    instructions="You help with multilingual requests",
    handoff_config=HandoffConfig(
        mode=HandoffMode.CONDITIONAL,
        targets=[
            HandoffTarget(
                agent_name="translator",
                condition=needs_translation,
                description="Hand off to translator for multilingual requests"
            )
        ]
    )
)
```
#### Context Passing
```python
HandoffTarget(
    agent_name="specialist",
    context_keys=["user_preferences", "session_data"],  # Pass specific context
    description="Hand off with user context"
)
```
#### AI-Powered Routing
```python
HandoffConfig(
    mode=HandoffMode.COORDINATOR,
    strategy=HandoffStrategy.BEST_MATCH,  # AI selects the best agent
    enable_auto_routing=True,
    routing_instructions="Choose the agent best suited for the user's request"
)
```
### Safety Features

#### Loop Detection
The framework automatically prevents infinite handoff loops:
```python
HandoffConfig(
    max_hops=10,  # Maximum handoffs before stopping
    # The framework tracks the call stack and prevents cycles
)
```
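The underlying idea can be shown with a plain-Python sketch (a hypothetical helper, not the framework's implementation): track the handoff path and refuse a hop that would exceed `max_hops` or revisit an agent already on the call stack.

```python
def check_handoff(path: list[str], target: str, max_hops: int = 10) -> None:
    """Raise if adding `target` would exceed max_hops or create a cycle."""
    if len(path) >= max_hops:
        raise RuntimeError(f"Handoff limit of {max_hops} hops reached: {path}")
    if target in path:
        raise RuntimeError(f"Handoff cycle detected: {path + [target]}")

path = ["travel_coordinator", "flight_agent"]
check_handoff(path, "hotel_agent")  # OK: new agent, under the hop limit

try:
    # Handing back to the coordinator would create a cycle
    check_handoff(path, "travel_coordinator")
except RuntimeError as err:
    print(err)
```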
#### Validation
All handoffs are validated before execution:
```python
# Check if a handoff is possible
if runner.can_handoff_to("target_agent"):
    await runner.handoff_to("target_agent", data)
```
### HTTP API Integration
Handoffs work seamlessly with the standard HTTP API:
```
# Request that triggers handoffs
POST /api/agents/travel_coordinator/chat
{
  "message": "Plan a trip to Tokyo with flights and hotels"
}

# Response includes handoff execution details
{
  "agent": "travel_coordinator",
  "response": "Complete travel plan with flights and hotels",
  "handoff_path": ["travel_coordinator", "flight_agent", "hotel_agent"],
  "conversation_id": "uuid-123"
}
```
### Real-World Examples
Our samples include complete handoff implementations:
- Weather Advisory System - Swarm pattern with peer-to-peer collaboration
- Travel Coordinator - Coordinator pattern with centralized orchestration
- Customer Service Hub - Conditional routing with AI-powered agent selection
These samples demonstrate production-ready handoff patterns with complete Azure Functions deployment configurations.
### Component Interaction Flow

#### 1. Request Processing Flow

```
HTTP Request → AgentFunctionApp → Agent.process_request() → LLM + Tools → Response
```

#### 2. Tool Execution Flow

```
Agent → ToolRegistry → [FunctionTool | MCPTool] → Result → LLM → Final Response
```

#### 3. Reflection Flow (ReflectionAgent)

```
Initial Response → Self-Evaluation → Reflection → Improvement → Quality Check → Final Response
```
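The tool-execution flow boils down to a dispatch loop: the LLM names a tool, the registry looks it up, runs it with the supplied arguments, and the result goes back to the LLM. A stripped-down, framework-free sketch of that loop (the registry and tool here are stand-ins, not framework classes):

```python
from typing import Callable

# Minimal tool registry: name -> callable (stand-in for the framework's ToolRegistry)
registry: dict[str, Callable[..., str]] = {}

def register(fn: Callable[..., str]) -> Callable[..., str]:
    """Record a function in the registry under its own name."""
    registry[fn.__name__] = fn
    return fn

@register
def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # placeholder instead of a real API call

def execute_tool_call(name: str, arguments: dict) -> str:
    """Look up the named tool and invoke it with the LLM-supplied arguments."""
    if name not in registry:
        raise KeyError(f"Unknown tool: {name}")
    return registry[name](**arguments)

# An LLM tool call arrives as a tool name plus JSON arguments
print(execute_tool_call("get_weather", {"city": "Seattle"}))  # Sunny in Seattle
```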
### Extensibility Points

#### Custom Agent Types

Extend the base `Agent` class to create specialized agent behaviors:
```python
class CustomAgent(Agent):
    async def process_request(self, request_data):
        # Custom pre-processing
        result = await super().process_request(request_data)
        # Custom post-processing
        return result
```
#### Custom Tools
Register functions as tools using the decorator pattern:
```python
@agent.tool
def my_custom_tool(param: str) -> str:
    """My custom tool description."""
    return f"Processed: {param}"
```
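Behind a decorator like this, frameworks typically derive a JSON-schema-style tool description from the function's signature so the LLM knows how to call it. A rough stdlib-only sketch of that idea (this is not the framework's actual generator, just an illustration of signature-to-schema mapping):

```python
import inspect

# Map Python annotations to JSON-schema type names (simplified)
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn) -> dict:
    """Build a minimal JSON-schema-like description from a function's signature."""
    sig = inspect.signature(fn)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        properties[name] = {"type": PY_TO_JSON.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {"type": "object", "properties": properties, "required": required},
    }

def my_custom_tool(param: str) -> str:
    """My custom tool description."""
    return f"Processed: {param}"

schema = tool_schema(my_custom_tool)
print(schema["name"])                               # my_custom_tool
print(schema["parameters"]["properties"]["param"])  # {'type': 'string'}
```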
#### MCP Server Integration
Connect to external MCP servers for enhanced capabilities:
```python
agent.add_mcp_server(MCPServer(
    name="MyMCPServer",
    mode=MCPServerMode.SSE,
    params=MCPServerSseParams(url="http://localhost:8080/mcp")
))
```
This architecture provides clear separation of concerns, enabling you to build everything from simple single-purpose agents to complex multi-agent systems with enterprise-grade reliability and scalability.
## Supported LLM Providers

### OpenAI
```python
from azurefunctions.agents.types import LLMConfig, LLMProvider

llm_config = LLMConfig(
    provider=LLMProvider.OPENAI,
    model_name="gpt-4",
    api_key="your-api-key"
)
```
### Anthropic Claude
```python
llm_config = LLMConfig(
    provider=LLMProvider.ANTHROPIC,
    model_name="claude-3-sonnet-20240229",
    api_key="your-anthropic-api-key"
)
```
### Google Gemini
```python
llm_config = LLMConfig(
    provider=LLMProvider.GOOGLE,
    model_name="gemini-pro",
    api_key="your-google-api-key"
)
```
### Azure OpenAI
```python
llm_config = LLMConfig(
    provider=LLMProvider.AZURE_OPENAI,
    model_name="gpt-4",
    endpoint="https://your-resource.openai.azure.com/",
    api_key="your-azure-openai-key",
    api_version="2024-02-15-preview"  # or your preferred API version
)
```
## Model Context Protocol (MCP) Integration
Connect your agents to MCP servers for enhanced capabilities:
```python
from azurefunctions.agents import Agent, MCPServer, MCPServerMode
from azurefunctions.agents import MCPServerSseParams
from azurefunctions.agents.types import LLMConfig, LLMProvider

# Configure LLM for the agent
llm_config = LLMConfig(
    provider=LLMProvider.OPENAI,
    model_name="gpt-4",
    api_key="your-openai-api-key"
)

# Configure MCP server (SSE mode example)
mcp_server = MCPServer(
    name="CodeExecutionMCPServer",
    mode=MCPServerMode.SSE,
    params=MCPServerSseParams(
        url="http://localhost:7072/runtime/webhooks/mcp/sse",
        headers={
            "Authorization": "Bearer your-mcp-api-token"
        },
        timeout=5.0,
        sse_read_timeout=300.0
    ),
    cache_tools_list=False
)

# Add to agent
code_agent = Agent(
    name="CodeExecutionAgent",
    instructions="You are a code execution agent that can run Python code to perform tasks.",
    mcp_servers=[mcp_server],
    llm_config=llm_config,
    description="A code execution agent that can run Python code to perform tasks."
)
```
### MCP Server Modes

The unified `MCPServer` supports three communication modes:
STDIO Mode (subprocess communication):
```python
from azurefunctions.agents import MCPServer, MCPServerMode
from azurefunctions.agents import MCPServerStdioParams

mcp_server = MCPServer(
    name="MyStdioServer",
    mode=MCPServerMode.STDIO,
    params=MCPServerStdioParams(
        command="python",
        args=["my_mcp_server.py"],
        env={"API_KEY": "your-key"}
    )
)
```
SSE Mode (Server-Sent Events):
```python
from azurefunctions.agents import MCPServer, MCPServerMode
from azurefunctions.agents import MCPServerSseParams

mcp_server = MCPServer(
    name="MySSEServer",
    mode=MCPServerMode.SSE,
    params=MCPServerSseParams(
        url="http://localhost:8080/sse",
        headers={"Authorization": "Bearer token"}
    )
)
```
Streamable HTTP Mode:
```python
from azurefunctions.agents import MCPServer, MCPServerMode
from azurefunctions.agents import MCPServerStreamableHttpParams

mcp_server = MCPServer(
    name="MyHttpServer",
    mode=MCPServerMode.STREAMABLE_HTTP,
    params=MCPServerStreamableHttpParams(
        session_url="http://localhost:8080/session"
    )
)
```
## Streaming Responses
Enable real-time streaming for better user experience:
```python
# Enable streaming in your agent
weather_agent = Agent(
    name="WeatherBot",
    instructions="Provide weather updates with streaming responses.",
    tools=[get_weather],
    llm_config=llm_config,
    streaming=True  # Enable SSE streaming
)
```
## Testing Your Agents
```python
import asyncio

# Test your agent locally
async def test_agent():
    response = await weather_agent.chat("What's the weather in Seattle?")
    print(response)

# Run the test
asyncio.run(test_agent())
```
## Project Structure

```
my-agent-app/
├── function_app.py          # Your main Function App
├── agents/
│   ├── __init__.py
│   ├── weather_agent.py     # Weather agent definition
│   └── tools/
│       └── weather_tools.py # Agent tools
├── host.json                # Azure Functions configuration
├── local.settings.json      # Local development settings
├── requirements.txt         # Python dependencies
└── .env                     # Environment variables
```
## Configuration

### Environment Variables
```bash
# LLM Provider API Keys
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_API_KEY=your-google-key

# Azure OpenAI (alternative to OpenAI)
AZURE_OPENAI_API_KEY=your-azure-openai-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_API_VERSION=2024-02-15-preview

# Azure Services (optional - for Key Vault, etc.)
AZURE_CLIENT_ID=your-client-id
AZURE_CLIENT_SECRET=your-client-secret
AZURE_TENANT_ID=your-tenant-id

# MCP Configuration (optional)
MCP_SERVER_PATH=/path/to/mcp/server
```
### Local Development

```json
// local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "OPENAI_API_KEY": "your-openai-api-key"
  }
}
```
## Deployment
Deploy to Azure Functions:
```bash
# Install Azure Functions Core Tools
npm install -g azure-functions-core-tools@4

# Create a Function App
func init my-agent-app --python
cd my-agent-app
# Add your agent code

# Deploy to Azure
func azure functionapp publish my-agent-app
```
## Production-Ready Samples

Our `samples/` directory contains complete, deployable Azure Functions examples:
### Single Agent - Weather Bot

Location: `samples/single-agent/`
A production-ready weather bot with:
- Real Weather Data: OpenWeatherMap API integration
- Error Handling: Comprehensive error handling and logging
- Health Checks: Built-in health monitoring endpoints
- Azure Functions: Complete function_app.py with HTTP triggers
- Security: API key management and rate limiting ready
```bash
cd samples/single-agent && func start
# POST /api/WeatherBot/chat - Chat with the weather bot
# GET  /api/WeatherBot/info - Get agent information
# GET  /api/health          - Health check endpoint
```
### Multi-Agent - Travel Planner

Location: `samples/multi-agent/`
A collaborative multi-agent system featuring:
- FlightAgent: Flight search and booking assistance
- HotelAgent: Hotel recommendations and reservations
- BudgetAgent: Cost analysis and budget optimization
- Inter-Agent Communication: Agents can collaborate on complex requests
- Scalable Architecture: Each agent handles specialized tasks
```bash
cd samples/multi-agent && func start
# POST /api/agents/FlightAgent/chat - Flight-specific queries
# POST /api/agents/HotelAgent/chat  - Hotel-specific queries
# POST /api/agents/BudgetAgent/chat - Budget analysis
# GET  /api/agents                  - List all available agents
```
### Multi-Agent Handoff Samples

Swarm Pattern: `samples/handoff-swarm/`
Weather advisory system with peer-to-peer collaboration:
- Decentralized Handoffs: Agents collaborate organically
- Weather + Conversion + Advice: Three specialized agents working together
- Dynamic Flows: Conversation paths adapt based on needs
- Loop Detection: Automatic prevention of infinite handoffs
```bash
cd samples/handoff-swarm && func start
# POST /api/agents/weather/chat - Weather agent (main entry)
# POST /api/weather-swarm       - Demo endpoint showing the handoff flow
```
Coordinator Pattern: `samples/handoff-coordinator/`
Travel coordinator with centralized orchestration:
- Central Coordinator: TravelCoordinator manages all specialists
- Unified Results: Consolidated responses from multiple agents
- Workflow Management: Parallel and sequential processing
- Complete Travel Planning: Flights, hotels, weather, restaurants
```bash
cd samples/handoff-coordinator && func start
# POST /api/agents/travel_coordinator/chat - Main coordinator
# POST /api/travel-coordinator-demo        - Demo endpoint
```
Conditional Pattern: `samples/handoff-conditional/`
Customer service hub with AI-powered routing:
- Intelligent Routing: AI analyzes requests and routes appropriately
- Customer Context: Takes into account customer history and preferences
- Automatic Escalation: Detects complex issues requiring escalation
- Multi-Specialist Support: Technical, billing, sales, and escalation teams
```bash
cd samples/handoff-conditional && func start
# POST /api/agents/customer_service/chat - Smart router
# POST /api/customer-service-demo        - Demo with routing analysis
```
### Provider Examples

Location: `samples/providers/`
Ready-to-use integrations with major LLM providers:
- Anthropic Claude: `anthropic_claude.py`
- Google Gemini: `google_gemini.py`
- Azure OpenAI: complete configuration examples in the sample templates
### MCP Integration

Location: `samples/mcp-integration/`
Model Context Protocol server integration:
- Weather MCP Agent: `weather_mcp_agent.py`
- External tool server connections
- Enhanced capabilities through the MCP protocol
### Advanced Features

Location: `samples/advanced-features/`
Advanced functionality demonstrations:
- Streaming Responses: `streaming_responses.py` - server-sent events implementation
- Real-time agent interactions
- Performance optimization techniques
### Quick Testing
Follow our Quick Test Guide to get any sample running in under 5 minutes:
```bash
# Test the single agent
cd samples/single-agent
cp local.settings.json.template local.settings.json
# Add your API keys, then:
func start

# Test the multi-agent system
cd samples/multi-agent
cp local.settings.json.template local.settings.json
# Add your API keys, then:
func start
```
## Contributing
We welcome contributions! Please see our Contributing Guide for details.
## License
This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ by the Azure Functions team