# Aigie Python SDK

Official Python SDK for Aigie: enterprise-grade AI agent reliability monitoring, tracing, and autonomous remediation.
95% of AI agents never reach production due to context drift, tool errors, and runtime instability. Aigie provides the infrastructure that makes autonomous AI reliable and production-grade:
- Detects context drift and errors before they impact users
- Fixes issues automatically through self-healing workflows
- Prevents failures with predictive intervention
## Installation

```bash
pip install aigie
```
### With optional integrations

```bash
# Compression (recommended for production -- 50-90% bandwidth savings)
pip install aigie[compression]

# LLM providers
pip install aigie[openai]            # OpenAI
pip install aigie[anthropic]         # Anthropic Claude
pip install aigie[gemini]            # Google Gemini

# Agent frameworks
pip install aigie[langchain]         # LangChain
pip install aigie[langgraph]         # LangGraph
pip install aigie[openai-agents]     # OpenAI Agents SDK
pip install aigie[claude-agent-sdk]  # Anthropic Claude Agent SDK
pip install aigie[strands]           # Strands Agents (AWS/Anthropic)
pip install aigie[google-adk]        # Google Agent Development Kit
pip install aigie[crewai]            # CrewAI multi-agent
pip install aigie[autogen]           # AutoGen/AG2
pip install aigie[llamaindex]        # LlamaIndex RAG
pip install aigie[dspy]              # DSPy
pip install aigie[instructor]        # Instructor structured outputs
pip install aigie[semantic-kernel]   # Microsoft Semantic Kernel
pip install aigie[browser-use]       # Browser-Use automation
pip install aigie[livekit-agents]    # LiveKit real-time voice AI
pip install aigie[agno]              # Agno (formerly Phidata)
pip install aigie[pipecat]           # Pipecat

# Vector databases
pip install aigie[pinecone]          # Pinecone
pip install aigie[qdrant]            # Qdrant
pip install aigie[chromadb]          # ChromaDB
pip install aigie[weaviate]          # Weaviate
pip install aigie[vectordbs]         # All vector DBs

# Observability
pip install aigie[opentelemetry]     # OpenTelemetry inbound
pip install aigie[otlp]              # OTLP export

# Everything
pip install aigie[all]
```
## Quick start

### Decorator-based tracing (recommended)

```python
from aigie import traceable

@traceable(run_type="agent")
async def my_agent(query: str):
    result = await process_query(query)
    return result

result = await my_agent("What is the weather?")
```
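To make the decorator's role concrete, here is a minimal, self-contained stand-in that records run type, status, and latency around an async function. This is an illustrative sketch only, not aigie's implementation; `traceable_sketch` and `process_query`'s stand-in body are hypothetical names introduced for the example.

```python
import asyncio
import functools
import time

def traceable_sketch(run_type: str):
    """Minimal stand-in for a run-type tracing decorator (illustration only)."""
    def decorator(fn):
        @functools.wraps(fn)
        async def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                result = await fn(*args, **kwargs)
                status = "success"
                return result
            except Exception:
                status = "error"
                raise
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000
                # A real tracer would ship this event to a backend;
                # here we just print it.
                print(f"[trace] {run_type}:{fn.__name__} {status} ({elapsed_ms:.1f} ms)")
        return wrapper
    return decorator

@traceable_sketch(run_type="agent")
async def my_agent(query: str) -> str:
    return f"answer to: {query}"

print(asyncio.run(my_agent("What is the weather?")))
```

The real decorator additionally captures inputs, outputs, and nesting; the structure (wrap, time, report in `finally`) is the same idea.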
### Auto-instrument LLM providers

```python
from aigie import wrap_openai, wrap_anthropic
from openai import AsyncOpenAI
from anthropic import AsyncAnthropic

# OpenAI -- all calls automatically traced with model, tokens, cost, latency
client = wrap_openai(AsyncOpenAI())
response = await client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}],
)

# Anthropic
client = wrap_anthropic(AsyncAnthropic())
```
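Conceptually, a `wrap_*` helper returns a proxy that intercepts each call, times it, and reads usage metadata off the response before handing it back. The sketch below shows that pattern with a dummy client; `DummyClient`, `wrap_client`, and the event shape are all hypothetical, not aigie's API.

```python
import time

class DummyResponse:
    """Stand-in for a provider response carrying token usage."""
    def __init__(self):
        self.usage = {"prompt_tokens": 9, "completion_tokens": 12}

class DummyClient:
    """Stand-in for an LLM client with a .create() method."""
    def create(self, **kwargs):
        return DummyResponse()

def wrap_client(client, on_event):
    """Illustrative wrapper: intercepts .create(), records latency and usage."""
    class Traced:
        def create(self, **kwargs):
            start = time.perf_counter()
            response = client.create(**kwargs)
            on_event({
                "model": kwargs.get("model"),
                "latency_ms": (time.perf_counter() - start) * 1000,
                "usage": response.usage,
            })
            return response
    return Traced()

events = []
traced = wrap_client(DummyClient(), events.append)
traced.create(model="gpt-4", messages=[{"role": "user", "content": "Hello!"}])
```

Because the proxy preserves the original call signature and return value, instrumented code needs no changes beyond the one-line wrap.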
### Context manager

```python
from aigie import Aigie, Config

config = Config(
    api_url="https://api.aigie.com",
    api_key="your-key",
    batch_size=100,
    flush_interval=5.0,
)
aigie = Aigie(config=config)
await aigie.initialize()

async with aigie.trace("My Workflow") as trace:
    async with trace.span("operation", type="llm") as span:
        result = await do_work()
        span.set_output({"result": result})
```
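The nesting above builds a parent-child span tree. A synchronous toy version of that structure (shown sync for brevity; `TraceSketch` and `SpanSketch` are hypothetical names, not aigie's classes):

```python
import contextlib

class SpanSketch:
    """Illustrative span: name, type, output, and child spans."""
    def __init__(self, name, type=None):
        self.name = name
        self.type = type
        self.output = None
        self.children = []

    def set_output(self, output):
        self.output = output

class TraceSketch:
    """Illustrative trace: a root span plus a context manager for children."""
    def __init__(self, name):
        self.root = SpanSketch(name)

    @contextlib.contextmanager
    def span(self, name, type=None):
        child = SpanSketch(name, type)
        self.root.children.append(child)  # attach before the body runs
        yield child

trace = TraceSketch("My Workflow")
with trace.span("operation", type="llm") as span:
    span.set_output({"result": 42})
```

The SDK's async context managers do the same thing while also timing each span and flushing the finished tree to the API.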
## Integrations

### Agent frameworks

| Framework | Install extra | Auto-instrument |
|---|---|---|
| LangChain | `langchain` | `AigieCallbackHandler` |
| LangGraph | `langgraph` | `wrap_langgraph()` |
| OpenAI Agents SDK | `openai-agents` | `patch_openai_agents()` |
| Claude Agent SDK | `claude-agent-sdk` | `patch_claude_agent_sdk()` |
| Strands Agents | `strands` | `patch_strands()` |
| Google ADK | `google-adk` | `patch_google_adk()` |
| CrewAI | `crewai` | `patch_crewai()` |
| AutoGen/AG2 | `autogen` | `patch_autogen()` |
| LlamaIndex | `llamaindex` | `patch_llamaindex()` |
| DSPy | `dspy` | `patch_dspy()` |
| Instructor | `instructor` | `patch_instructor()` |
| Semantic Kernel | `semantic-kernel` | `patch_semantic_kernel()` |
| Browser-Use | `browser-use` | `patch_browser_use()` |
| LiveKit Agents | `livekit-agents` | `patch_livekit_agents()` |
| Pipecat | `pipecat` | `patch_pipecat()` |
| Agno | `agno` | `patch_agno()` |
### LLM providers

| Provider | Wrapper |
|---|---|
| OpenAI | `wrap_openai()` |
| Anthropic | `wrap_anthropic()` |
| Google Gemini | `wrap_gemini()` |
| AWS Bedrock | `wrap_bedrock()` |
Each integration lives in `sdk/aigie/integrations/<framework>/` and follows a consistent pattern: auto-instrumentation, cost tracking, drift detection, error detection, retry logic, and session management. See CONTRIBUTING.md for details on adding new integrations.
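A `patch_*` style integration typically monkeypatches a framework entry point so every call is recorded before the original method runs. A minimal sketch of that pattern against a made-up framework class (`FakeAgent` and `patch_fake_agent` are hypothetical; this is not aigie's code):

```python
import functools

class FakeAgent:
    """Stand-in for a third-party framework class to be instrumented."""
    def run(self, task: str) -> str:
        return task.upper()

def patch_fake_agent(record):
    """Illustrative patch_* helper: replace FakeAgent.run with a traced
    wrapper that records the call, then delegates to the original."""
    original = FakeAgent.run

    @functools.wraps(original)
    def traced_run(self, task: str) -> str:
        record.append({"method": "run", "task": task})
        return original(self, task)

    FakeAgent.run = traced_run

calls = []
patch_fake_agent(calls)
result = FakeAgent().run("hello")
```

Patching at the class level means all existing and future instances are instrumented without any change to user code, which is why one `patch_*()` call at startup is enough.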
## Configuration

### Environment variables

```bash
export AIGIE_API_URL=https://your-instance.aigie.io/api
export AIGIE_API_KEY=your-api-key
export AIGIE_BATCH_SIZE=100
export AIGIE_FLUSH_INTERVAL=5.0
```
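Environment-variable configuration usually means reading each variable with a fallback default and coercing numeric values. A sketch of that parsing, assuming the defaults documented in the `Config` section below (`load_config_from_env` is a hypothetical helper, not the SDK's loader):

```python
import os

def load_config_from_env(env=os.environ):
    """Illustrative env parsing with assumed defaults."""
    return {
        "api_url": env.get("AIGIE_API_URL", "https://api.aigie.com"),
        "api_key": env.get("AIGIE_API_KEY"),
        "batch_size": int(env.get("AIGIE_BATCH_SIZE", "10")),
        "flush_interval": float(env.get("AIGIE_FLUSH_INTERVAL", "5.0")),
    }

# Simulate a process environment rather than mutating os.environ.
cfg = load_config_from_env({
    "AIGIE_API_KEY": "your-api-key",
    "AIGIE_BATCH_SIZE": "100",
})
```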
### Config object

```python
from aigie import Config

config = Config(
    api_url="https://api.aigie.com",  # Aigie API endpoint
    api_key="your-key",               # API key
    batch_size=100,                   # Events per batch (default: 10)
    flush_interval=5.0,               # Flush interval in seconds
    enable_buffering=True,            # Enable event buffering (default: True)
    max_retries=3,                    # Retry count on failure
)
```
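The interaction of `batch_size` and buffering can be pictured as a small accumulator that ships events in groups. A minimal sketch (time-based `flush_interval` flushing omitted for brevity; `EventBuffer` is a hypothetical name, not the SDK's buffer):

```python
class EventBuffer:
    """Illustrative batching buffer: events accumulate until batch_size
    is reached, then the whole batch is sent at once."""
    def __init__(self, batch_size, send):
        self.batch_size = batch_size
        self.send = send          # callable that ships one batch
        self.pending = []

    def add(self, event):
        self.pending.append(event)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.pending:
            self.send(list(self.pending))
            self.pending.clear()

batches = []
buf = EventBuffer(batch_size=3, send=batches.append)
for i in range(7):
    buf.add({"event": i})
buf.flush()  # flush the remainder, as a shutdown hook would
```

Batching is what makes tracing cheap at high event rates: one HTTP request per batch instead of one per event, at the cost of up to `flush_interval` seconds of delivery delay.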
## Advanced features

### OpenTelemetry integration

```python
from aigie import Aigie
from aigie.opentelemetry import setup_opentelemetry

aigie = Aigie()
await aigie.initialize()
setup_opentelemetry(aigie, service_name="my-service")

# All OTel spans now flow to Aigie
from opentelemetry import trace

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("operation"):
    pass
```
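A bridge like this generally works by registering a span processor whose end-of-span hook forwards each finished span to the backend sink. A dependency-free sketch of that hook (the real bridge would implement the `opentelemetry-sdk` `SpanProcessor` interface; `AigieSpanForwarder` is a hypothetical name):

```python
class AigieSpanForwarder:
    """Illustrative span processor: on_end() fires once per finished span
    and forwards a compact record to a sink."""
    def __init__(self, sink):
        self.sink = sink

    def on_end(self, span):
        # A real implementation would read span.name, attributes, and
        # timestamps off an OTel ReadableSpan; a dict stands in here.
        self.sink.append({
            "name": span["name"],
            "duration_ms": span["duration_ms"],
        })

sink = []
forwarder = AigieSpanForwarder(sink)
forwarder.on_end({"name": "operation", "duration_ms": 12.5})
```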
### Evaluation and scoring

```python
from aigie import score, feedback

await score(trace_id, "accuracy", 0.95)
await feedback(trace_id, "user_feedback", "Great response!")
```
### Prompt management

```python
from aigie import Prompt

prompt = Prompt.chat(
    name="customer_support",
    messages=[{"role": "system", "content": "You are a helpful assistant."}],
    version="1.0",
)
```
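The core of a versioned chat prompt is just a named, versioned message list plus variable substitution at render time. An illustrative data shape mirroring the example above (`PromptSketch` is hypothetical, not aigie's `Prompt` class, and the `{placeholder}` rendering is an assumption for the example):

```python
from dataclasses import dataclass

@dataclass
class PromptSketch:
    """Illustrative versioned chat prompt."""
    name: str
    messages: list
    version: str = "1.0"

    def format(self, **variables):
        # Substitute {placeholders} in each message's content.
        return [
            {**m, "content": m["content"].format(**variables)}
            for m in self.messages
        ]

p = PromptSketch(
    name="customer_support",
    messages=[{"role": "system", "content": "You are a helpful assistant for {product}."}],
    version="1.0",
)
rendered = p.format(product="Aigie")
```

Pinning `name` plus `version` is what lets traces record exactly which prompt revision produced a given run.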
### Synchronous API

```python
from aigie import AigieSync

aigie = AigieSync()
aigie.initialize()

with aigie.trace("workflow") as trace:
    with trace.span("operation") as span:
        result = do_work()
        span.set_output({"result": result})
```
## API reference

Full API documentation is available at docs.aigie.io/sdk/python.
## Development

```bash
# Clone and set up
git clone https://github.com/Kytte-AI/kytte-python-sdk.git
cd kytte-python-sdk
python -m venv .venv
source .venv/bin/activate
pip install -e "sdk/[dev]"
```
### Common commands

```bash
make lint      # Run ruff linter
make format    # Format code with ruff
make test      # Run unit tests
make test-all  # Run all tests including integration
make coverage  # Run tests with coverage report
make typecheck # Run mypy type checking
make check     # Run all checks (lint + test)
make build     # Build distribution packages
```
### Running tests

```bash
# Unit tests
pytest tests/unit/ -v

# Integration tests (requires API keys)
pytest tests/integration/ -v

# Coverage report
pytest tests/unit/ --cov=sdk/aigie --cov-report=html --cov-report=term-missing
```
### Publishing

Releases are published to PyPI automatically when a GitHub release is created, via the publish.yaml workflow. Manual publishing is also supported:

```bash
./scripts/publish-sdk.sh <version>
# e.g. ./scripts/publish-sdk.sh 0.2.39
```
## Contributing

See CONTRIBUTING.md for development setup and guidelines.

## License

MIT -- see LICENSE for details.
## Related

| Repository | Description |
|---|---|
| kytte-js-sdk | Aigie TypeScript/JavaScript SDK |
| docs-site | Documentation site |
## Download files
### aigie-0.2.40.tar.gz (source distribution)

- Size: 1.1 MB
- Uploaded via: twine/6.2.0 CPython/3.11.15
- Uploaded using Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `6f711bc2f03a578a4ba8b9de154a7436151097e62488b8fa2785d3dae711378f` |
| MD5 | `d231218d91dd89e25784664db0d40829` |
| BLAKE2b-256 | `303b19560cb0a44a3b7e96262a5f132d4edc158ac9e4a6870795effc18a6040f` |
### aigie-0.2.40-py3-none-any.whl (built distribution, Python 3)

- Size: 1.2 MB
- Uploaded via: twine/6.2.0 CPython/3.11.15
- Uploaded using Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `a422a1212c2780ba1d0da2e424549b686aa49a44870255f63725500654061bf6` |
| MD5 | `225db717b134a7551d53fec9eb5fdb27` |
| BLAKE2b-256 | `90a18370d94509d96f791c62dd1ac6997d35a6daacf335d35908d1d43d838562` |