Semantic Kernel Agent Factory
A comprehensive SDK for creating and managing AI agents powered by Microsoft Semantic Kernel with MCP (Model Context Protocol) server integration. Build sophisticated conversational agents with tool integration, deploy them as web services, or interact with them through a rich terminal interface.
Features
- 🤖 Agent Factory: Create and manage multiple Semantic Kernel-based agents with different configurations
- 🔗 MCP Integration: Connect agents to external tools via Model Context Protocol (stdio and streamable HTTP servers)
- 🖥️ Interactive Console: Rich terminal-based chat interface with multi-agent support powered by Textual
- 🌐 Web Service Factory: Deploy agents as HTTP/REST APIs with A2A (Agent-to-Agent) protocol support
- ⚡ Streaming Support: Real-time response streaming for both console and web interfaces
- 📊 Health Monitoring: Built-in MCP server health checks and status monitoring
- 🔧 Flexible Configuration: YAML-based configuration with environment variable support
- 🎯 Structured Outputs: Support for JSON schema-based response formatting
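Configuration values such as `${OPENAI_API_KEY}` in the examples below are resolved from environment variables. As an illustration only (not the library's actual loader), `${VAR}` substitution can be implemented with a small regex pass:

```python
import os
import re

_VAR = re.compile(r"\$\{(\w+)\}")

def expand_env(text: str) -> str:
    """Replace ${VAR} placeholders with environment values; leave unknown
    placeholders untouched. Illustrative sketch, not the library's code."""
    return _VAR.sub(lambda m: os.environ.get(m.group(1), m.group(0)), text)

os.environ["OPENAI_API_KEY"] = "sk-demo"
print(expand_env('api_key: "${OPENAI_API_KEY}"'))  # api_key: "sk-demo"
```

Leaving unknown placeholders intact (rather than substituting an empty string) makes missing configuration easier to spot.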
Installation
Basic Installation
# Install core functionality only
pip install semantic-kernel-agent-factory
Installation with Optional Features
# For console/CLI interface
pip install semantic-kernel-agent-factory[console]
# For web service deployment
pip install semantic-kernel-agent-factory[service]
# For development (includes testing, linting, and type checking tools)
pip install semantic-kernel-agent-factory[dev]
# For documentation generation
pip install semantic-kernel-agent-factory[docs]
# Install all optional features
pip install semantic-kernel-agent-factory[all]
Development Installation
For local development:
# Clone the repository
git clone https://github.com/jhzhu89/semantic-kernel-agent-factory
cd semantic-kernel-agent-factory
# Install in editable mode with development dependencies only
pip install -e ".[dev]"
# OR install with all features for comprehensive development/testing
pip install -e ".[dev-all]"
# Use the Makefile for quick setup
make install-dev # Basic development setup
make install-dev-all # Development setup with all features
Quick Start
1. Console Application
Create a configuration file config.yaml:
agent_factory:
  agents:
    GeneralAssistant:
      name: "GeneralAssistant"
      instructions: |
        You are a helpful AI assistant.
        Answer questions clearly and concisely.
      model: "gpt-4"
      model_settings:
        temperature: 0.7
  openai_models:
    gpt-4:
      model: "gpt-4"
      api_key: "${OPENAI_API_KEY}"
      endpoint: "${AZURE_OPENAI_ENDPOINT}"
Run the interactive console:
# Note: Requires console dependencies
# Install with: pip install semantic-kernel-agent-factory[console]
agent-factory -c config.yaml
2. Python API - Agent Factory
import asyncio
from agent_factory import AgentFactory, AgentFactoryConfig, AgentConfig

async def main():
    # Create configuration
    config = AgentFactoryConfig(
        agents={
            "assistant": AgentConfig(
                name="assistant",
                instructions="You are a helpful AI assistant",
                model="gpt-4",
            )
        },
        openai_models={
            "gpt-4": {
                "model": "gpt-4",
                "api_key": "your-api-key",
                "endpoint": "your-endpoint",
            }
        },
    )

    # Create and use agents
    async with AgentFactory(config) as factory:
        agent = factory.get_agent("assistant")
        # Use the agent for conversations

asyncio.run(main())
3. Web Service Deployment
Create a service configuration service_config.yaml:
agent_factory:
  agents:
    ChatBot:
      name: "ChatBot"
      instructions: "You are a helpful chatbot"
      model: "gpt-4"
  openai_models:
    gpt-4:
      model: "gpt-4"
      api_key: "${OPENAI_API_KEY}"
      endpoint: "${AZURE_OPENAI_ENDPOINT}"
service_factory:
  services:
    ChatBot:
      card:
        name: "ChatBot"
        description: "AI-powered chatbot service"
      enable_token_streaming: true
Deploy as web service:
# Note: Requires service dependencies
# Install with: pip install semantic-kernel-agent-factory[service]
import asyncio

import uvicorn

from agent_factory import AgentServiceFactory, AgentServiceFactoryConfig

async def main():
    config = AgentServiceFactoryConfig.from_file("service_config.yaml")
    # Keep the factory context open while serving so agent resources stay alive
    async with AgentServiceFactory(config) as factory:
        app = await factory.create_application()
        server = uvicorn.Server(uvicorn.Config(app, host="0.0.0.0", port=8000))
        await server.serve()

if __name__ == "__main__":
    asyncio.run(main())
MCP Server Integration
Connect agents to external tools using Model Context Protocol servers:
agent_factory:
  agents:
    ToolAgent:
      name: "ToolAgent"
      instructions: "You have access to various tools"
      model: "gpt-4"
      mcp_servers: ["time", "kubernetes"]
mcp:
  # Azure AD configuration for authenticated MCP servers
  auth:
    azure_ad:
      tenant_id: "${AZURE_TENANT_ID}"
      client_id: "${AZURE_CLIENT_ID}"
      client_secret: "${AZURE_CLIENT_SECRET}"
  servers:
    time:
      type: "stdio"
      command: "python"
      args: ["-m", "mcp_server_time"]
    kubernetes:
      type: "streamable_http"
      url: "https://k8s-mcp-server.example.com/mcp"
      timeout: 10
      auth:
        enable_s2s: true
        enable_user_assertion: true
        scope: "a1b2c3d4-5e6f-7a8b-9c0d-1e2f3a4b5c6d/user.read"
MCP Server Authentication
The system supports two authentication mechanisms for HTTP-based MCP servers that require Azure AD authentication:
Service-to-Service (S2S) Authentication
Uses application credentials (client secret or certificate) for server-to-server communication. The system automatically adds Bearer tokens to HTTP headers when communicating with MCP servers.
mcp:
  auth:
    azure_ad:
      tenant_id: "${AZURE_TENANT_ID}"
      client_id: "${AZURE_CLIENT_ID}"
      client_secret: "${AZURE_CLIENT_SECRET}"
      # OR use certificate instead of client_secret:
      # certificate_pem: "${AZURE_CERTIFICATE_PEM}"
  servers:
    user_service:
      type: "streamable_http"
      url: "https://user-api.example.com/mcp"
      auth:
        enable_s2s: true
        enable_user_assertion: false
        scope: "b2c3d4e5-6f7a-8b9c-0d1e-2f3a4b5c6d7e/.default"
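Under the hood, S2S corresponds to the standard OAuth2 client-credentials flow against the Microsoft identity platform. The helper below only builds the token request (it never calls Azure) and is an illustrative sketch, not the library's implementation:

```python
from urllib.parse import urlencode

def build_client_credentials_request(tenant_id: str, client_id: str,
                                     client_secret: str, scope: str):
    """Build the OAuth2 client-credentials token request (illustrative).
    The access_token in the response is then sent to the MCP server as
    'Authorization: Bearer <token>'."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    })
    return url, body

url, body = build_client_credentials_request(
    "my-tenant", "my-client", "s3cret",
    "b2c3d4e5-6f7a-8b9c-0d1e-2f3a4b5c6d7e/.default",
)
```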
On-Behalf-Of (OBO) Authentication
Uses user assertion tokens to perform operations on behalf of the authenticated user. The system uses Semantic Kernel filters to inject user tokens as function parameters.
mcp:
  auth:
    azure_ad:
      tenant_id: "${AZURE_TENANT_ID}"
      client_id: "${AZURE_CLIENT_ID}"
      client_secret: "${AZURE_CLIENT_SECRET}"
  servers:
    kubernetes:
      type: "streamable_http"
      url: "https://k8s-mcp-server.example.com/mcp"
      auth:
        enable_s2s: true
        enable_user_assertion: true
        scope: "a1b2c3d4-5e6f-7a8b-9c0d-1e2f3a4b5c6d/user.read"
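OBO exchanges the incoming user token for a downstream token via the jwt-bearer grant. The parameter names below follow the Microsoft identity platform OBO flow; as before, this only builds the request and is a sketch rather than the library's code:

```python
from urllib.parse import urlencode

def build_obo_request(tenant_id: str, client_id: str, client_secret: str,
                      user_token: str, scope: str):
    """Build an on-behalf-of token request (illustrative)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "client_id": client_id,
        "client_secret": client_secret,
        "assertion": user_token,  # the incoming user's access token
        "scope": scope,
        "requested_token_use": "on_behalf_of",
    })
    return url, body

url, body = build_obo_request(
    "my-tenant", "my-client", "s3cret", "user-access-token",
    "a1b2c3d4-5e6f-7a8b-9c0d-1e2f3a4b5c6d/user.read",
)
```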
Important Notes:
- For S2S authentication: MCP servers receive Bearer tokens in the Authorization header
- For OBO authentication: MCP servers receive user tokens via the user_assertion parameter
- Do not include user_assertion in your tool's input schema definition; the authentication tokens are automatically injected by the system
- Both authentication methods can be enabled simultaneously if needed
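The user_assertion injection described above can be pictured as a filter that adds the parameter just before a tool call. The sketch below is library-agnostic and hypothetical (it does not use Semantic Kernel's actual filter API, and list_pods is an invented tool):

```python
from typing import Any, Callable

def with_user_assertion(tool: Callable[..., Any],
                        get_token: Callable[[], str]) -> Callable[..., Any]:
    """Wrap a tool so user_assertion is injected at call time (illustrative).
    The caller never passes the token; it is added by the wrapper, which is
    why user_assertion must not appear in the tool's input schema."""
    def wrapper(**kwargs: Any) -> Any:
        kwargs["user_assertion"] = get_token()  # injected, not caller-supplied
        return tool(**kwargs)
    return wrapper

def list_pods(namespace: str, user_assertion: str) -> str:
    # Hypothetical tool: would call the MCP server on behalf of the user
    return f"pods in {namespace} (as {user_assertion[:8]}...)"

wrapped = with_user_assertion(list_pods, lambda: "eyJhbGciOiJSUzI1NiJ9.demo")
print(wrapped(namespace="default"))  # pods in default (as eyJhbGci...)
```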
Console Features
The interactive console provides:
- Multi-Agent Chat: Switch between different agents in tabbed interface
- Real-time Streaming: See responses as they're generated
- MCP Status Monitoring: Live health checks of connected MCP servers
- Function Call Visibility: See tool calls and results in real-time
- Keyboard shortcuts:
  - Ctrl+Enter: Send message
  - Ctrl+L: Clear chat
  - F1: Toggle agent panel
  - F2: Toggle logs
  - Ctrl+W: Close tab
Configuration
Agent Configuration
agent_factory:
  agents:
    MyAgent:
      name: "MyAgent"
      instructions: "System prompt for the agent"
      model: "gpt-4"
      model_settings:
        temperature: 0.7
        max_tokens: 2000
      response_json_schema:  # Optional structured output
        type: "object"
        properties:
          answer:
            type: "string"
      mcp:
        servers: ["tool1", "tool2"]
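The response_json_schema block asks the model to return JSON matching the given schema. A minimal, hand-rolled shape check for the schema above (covering only a tiny subset of JSON Schema, not a real validator) looks like:

```python
import json

schema = {"type": "object", "properties": {"answer": {"type": "string"}}}

def matches_simple_schema(payload: str, schema: dict) -> bool:
    """Check a JSON string against a tiny JSON Schema subset:
    top-level object type plus per-property primitive types."""
    try:
        data = json.loads(payload)
    except json.JSONDecodeError:
        return False
    if schema.get("type") == "object" and not isinstance(data, dict):
        return False
    type_map = {"string": str, "number": (int, float), "boolean": bool}
    for key, spec in schema.get("properties", {}).items():
        expected = type_map.get(spec.get("type"), object)
        if key in data and not isinstance(data[key], expected):
            return False
    return True

print(matches_simple_schema('{"answer": "42"}', schema))  # True
```

In practice a full validator (e.g. the jsonschema package) would be used; this sketch just shows what "schema-based response formatting" guarantees about the output shape.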
OpenAI Models
agent_factory:
  openai_models:
    gpt-4:
      model: "gpt-4"
      api_key: "${OPENAI_API_KEY}"
      endpoint: "${AZURE_OPENAI_ENDPOINT}"
    gpt-3.5-turbo:
      model: "gpt-3.5-turbo"
      api_key: "${OPENAI_API_KEY}"
      endpoint: "${AZURE_OPENAI_ENDPOINT}"
MCP Server Types
Stdio servers (local processes):
mcp:
  servers:
    local_tool:
      type: "stdio"
      command: "python"
      args: ["-m", "my_mcp_server"]
      env:
        DEBUG: "true"
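A stdio server is simply a child process speaking JSON-RPC over stdin/stdout. The round trip can be sketched with a toy echo child (not a real MCP server, and not this library's transport code):

```python
import json
import subprocess
import sys

# Toy child process: reads one JSON-RPC request line and answers it.
CHILD = r"""
import json, sys
req = json.loads(sys.stdin.readline())
resp = {"jsonrpc": "2.0", "id": req["id"], "result": {"echo": req["method"]}}
print(json.dumps(resp), flush=True)
"""

proc = subprocess.Popen(
    [sys.executable, "-c", CHILD],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
out, _ = proc.communicate(json.dumps(request) + "\n")
response = json.loads(out)
print(response["result"])  # {'echo': 'tools/list'}
```

The real transport is line-oriented in the same way, which is why a stdio server is configured with just a command and args: the factory owns the process and its pipes.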
Streamable HTTP servers (HTTP-based):
mcp:
  auth:
    azure_ad:
      tenant_id: "${AZURE_TENANT_ID}"
      client_id: "${AZURE_CLIENT_ID}"
      client_secret: "${AZURE_CLIENT_SECRET}"
  servers:
    # Unauthenticated server
    simple_tool:
      type: "streamable_http"
      url: "https://api.example.com/mcp"
      timeout: 15
    # Authenticated server (S2S and user assertion enabled)
    service_tool:
      type: "streamable_http"
      url: "https://service-api.example.com/mcp"
      auth:
        enable_s2s: true
        enable_user_assertion: true
        scope: "c3d4e5f6-7a8b-9c0d-1e2f-3a4b5c6d7e8f/user.read"
CLI Commands
# Start interactive chat (requires console dependencies)
agent-factory -c config.yaml
# List configured agents
agent-factory list -c config.yaml
# Enable verbose logging
agent-factory -c config.yaml --verbose
# Custom log directory
agent-factory -c config.yaml --log-dir /path/to/logs
Environment Variables
Configure using environment variables:
# OpenAI Configuration
export OPENAI_API_KEY="your-api-key"
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
# Azure AD Configuration (for MCP server authentication)
export AZURE_TENANT_ID="your-tenant-id"
export AZURE_CLIENT_ID="your-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"
# OR use certificate instead:
# export AZURE_CERTIFICATE_PEM="your-certificate-pem-content"
# Optional: Agent Factory Settings
export AGENT_FACTORY__MODEL_SELECTION="cost" # first, cost, latency, quality
export AGENT_FACTORY__MCP_FAILURE_STRATEGY="lenient" # strict, lenient
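The double-underscore names suggest prefixed, nested settings keys. Purely as an illustration (not the library's settings loader), such variables can be collected like this:

```python
import os

def collect_settings(prefix: str = "AGENT_FACTORY__") -> dict:
    """Gather PREFIX__KEY environment variables into a settings dict,
    lower-casing the key part. Illustrative sketch only."""
    return {
        key[len(prefix):].lower(): value
        for key, value in os.environ.items()
        if key.startswith(prefix)
    }

os.environ["AGENT_FACTORY__MODEL_SELECTION"] = "cost"
os.environ["AGENT_FACTORY__MCP_FAILURE_STRATEGY"] = "lenient"
print(collect_settings())
```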
Examples
See the examples/ directory for:
- cli_example.yaml - Console application setup
- agent_service_factory_config.yaml - Web service configuration
- web_service.py - Web service deployment example
Development
Quick Setup
# Clone repository
git clone https://github.com/jhzhu89/semantic-kernel-agent-factory
cd semantic-kernel-agent-factory
# Install in editable mode with all development dependencies
pip install -e ".[dev]"
Available Development Tools
The [dev] extra includes:
- Testing: pytest, pytest-asyncio, pytest-cov, pytest-mock
- Code Formatting: black, isort, ruff
- Type Checking: mypy with type stubs
- Linting: flake8, ruff
- Coverage: pytest-cov for test coverage reports
Development Commands
# Run tests
pytest
# Run tests with coverage
pytest --cov=agent_factory --cov-report=html
# Format code
black .
isort .
# Lint code
ruff check .
flake8 .
# Type checking
mypy agent_factory
# Run all quality checks
make test-cov # Runs tests with coverage
make format # Formats code
make type-check # Type checking
Optional Development Features
# Install with console dependencies for development
pip install -e ".[dev,console]"
# For web service development
pip install -e ".[dev,service]"
# Install all features for development
pip install -e ".[dev,all]"
Architecture
The Semantic Kernel Agent Factory consists of several key components:
- AgentFactory: Core factory for creating and managing Semantic Kernel agents
- AgentServiceFactory: Web service wrapper that exposes agents as HTTP APIs (requires the [service] extra)
- MCPProvider: Manages connections to Model Context Protocol servers
- Console Application: Terminal-based interface for interactive agent chat (requires the [console] extra)
- Configuration System: YAML-based configuration with validation
Optional Components
Different installation options enable additional features:
- [console]: Interactive terminal interface with Textual UI, Click CLI commands
- [service]: A2A-based web services, Starlette server support
- [docs]: Sphinx-based documentation generation
- [dev]: Development tools for testing, linting, and type checking
Requirements
- Python 3.10+
- Microsoft Semantic Kernel
- Azure OpenAI or OpenAI API access
- Optional: MCP-compatible tool servers
Contributing
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Install development dependencies: pip install -e ".[dev]"
- Add tests for new functionality
- Run the test suite and ensure all checks pass (pytest, black, isort, ruff check, flake8, mypy agent_factory; see Development Commands above)
- Submit a pull request
Development Environment Setup
# Install with console dependencies for development
pip install -e ".[dev,console]"
# For web service development
pip install -e ".[dev,service]"
# For full development environment
pip install -e ".[dev,all]"
Project Structure
- agent_factory/ - Core library code
- tests/ - Test suite
- examples/ - Usage examples
- docs/ - Documentation (when using the [docs] extra)
Code Quality Standards
This project uses:
- Black for code formatting
- isort for import sorting
- Ruff for linting
- Flake8 for additional linting
- mypy for type checking
- pytest for testing with >80% coverage requirement
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support & Documentation
Related Projects
- Microsoft Semantic Kernel
- Model Context Protocol
- Textual - Powers the console interface