Aigie Python SDK
Enterprise-grade AI agent reliability monitoring and autonomous remediation
Production-grade Python SDK for integrating Aigie monitoring and reliability infrastructure into your AI agent workflows.
Unlike traditional observability tools that only monitor, Aigie:
- DETECTS context drift and errors before they impact users
- FIXES issues automatically through self-healing workflows
- PREVENTS failures with predictive intervention
🎯 Why Aigie?
95% of AI agents never reach production due to context drift, tool errors, and runtime instability. Aigie provides the infrastructure that makes autonomous AI reliable and production-grade.
Key Differentiators
| Feature | Aigie | Traditional Tools |
|---|---|---|
| Context drift detection | ✅ | ❌ |
| Auto-error correction | ✅ | ❌ |
| Production guardrails | ✅ | ❌ |
| Self-healing workflows | ✅ | ❌ |
| Reliability scoring | ✅ | ❌ |
| Predictive prevention | ✅ | ❌ |
✨ Features
- 🚀 Event Buffering: 10-100x performance improvement with batch uploads
- 🎯 Decorator Support: 50%+ less boilerplate code
- ⚙️ Flexible Configuration: Config class with sensible defaults
- 🔄 Automatic Retries: Exponential backoff with configurable policies
- 🔌 Framework Integrations: LangChain, LangGraph, Browser-Use, CrewAI, AutoGen, and more
- 🏭 Production Ready: Handles network failures, race conditions, and more
- 🔍 Auto-Instrumentation: Automatic tracing for OpenAI, Anthropic, Gemini, and more
- 💰 Cost Tracking: Built-in token counting and cost calculation
- 📊 Evaluation System: Comprehensive scoring and evaluation framework
- 🛡️ Safety Metrics: PII detection, toxicity, bias, prompt injection scanning
📦 Installation
Basic Installation
pip install aigie
With Optional Features
# Compression (recommended for production - 50-90% bandwidth savings)
pip install aigie[compression]
# LLM Provider integrations
pip install aigie[openai] # OpenAI wrapper
pip install aigie[anthropic] # Anthropic Claude
pip install aigie[gemini] # Google Gemini
# Agent Framework integrations
pip install aigie[langchain] # LangChain integration
pip install aigie[langgraph] # LangGraph integration
pip install aigie[browser-use] # Browser-Use integration
pip install aigie[crewai] # CrewAI multi-agent
pip install aigie[autogen] # AutoGen/AG2 conversations
pip install aigie[llamaindex] # LlamaIndex RAG
pip install aigie[openai-agents] # OpenAI Agents SDK
pip install aigie[dspy] # DSPy modular LLM
pip install aigie[claude-agent-sdk] # Anthropic Claude Agent SDK
pip install aigie[strands] # Strands Agents SDK (AWS/Anthropic)
pip install aigie[instructor] # Instructor structured outputs
pip install aigie[semantic-kernel] # Microsoft Semantic Kernel
# Vector DB integrations
pip install aigie[pinecone] # Pinecone
pip install aigie[qdrant] # Qdrant
pip install aigie[chromadb] # ChromaDB
pip install aigie[weaviate] # Weaviate
pip install aigie[vectordbs] # All vector DBs
# Observability
pip install aigie[opentelemetry] # OpenTelemetry support
# All features
pip install aigie[all]
🚀 Quick Start
Option 1: Decorator (Recommended - 50% less code!)
from aigie import traceable

@traceable(run_type="agent")
async def my_agent(query: str):
    # Your agent logic here
    result = await process_query(query)
    return result

# Use it normally
result = await my_agent("What is the weather?")
Option 2: Context Manager (Traditional)
from aigie import Aigie

aigie = Aigie()
await aigie.initialize()

async with aigie.trace("My Workflow") as trace:
    async with trace.span("operation", type="llm") as span:
        result = await do_work()
        span.set_output({"result": result})
Option 3: With Configuration
from aigie import Aigie, Config

config = Config(
    api_url="https://api.aigie.com",
    api_key="your-key",
    batch_size=100,      # Buffer 100 events before sending
    flush_interval=5.0,  # Or flush every 5 seconds
)

aigie = Aigie(config=config)
await aigie.initialize()
🔧 Configuration
Environment Variables
export AIGIE_API_URL=http://your-aigie-instance:8000/api
export AIGIE_API_KEY=your-api-key-here
export AIGIE_BATCH_SIZE=100
export AIGIE_FLUSH_INTERVAL=5.0
Config Object
from aigie import Config

config = Config(
    api_url="https://api.aigie.com",
    api_key="your-key",
    batch_size=100,
    flush_interval=5.0,
    enable_buffering=True,  # Default: True
    max_retries=3,
)
📊 Performance
Before (No Buffering)
- 1000 spans = 1000+ API calls
- ~30 seconds total time
- High network overhead
After (With Buffering)
- 1000 spans = 2-10 API calls
- ~0.5 seconds total time
- 99%+ reduction in API calls
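The arithmetic behind these numbers can be sketched with a toy buffer. The class and method names below are illustrative, not Aigie's actual internals; the point is just that a batch size of 100 turns 1000 events into roughly 10 uploads:

```python
# Toy event buffer: collects events and uploads them in batches,
# so 1000 events with batch_size=100 cost 10 uploads instead of 1000.
class EventBuffer:
    def __init__(self, batch_size=100):
        self.batch_size = batch_size
        self.pending = []
        self.uploads = 0  # stands in for real API calls

    def add(self, event):
        self.pending.append(event)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.pending:
            self.uploads += 1  # one API call per batch
            self.pending.clear()

buf = EventBuffer(batch_size=100)
for i in range(1000):
    buf.add({"span": i})
buf.flush()  # send any remainder
print(buf.uploads)  # 10 uploads for 1000 events
```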
🔌 Integrations
LLM Providers
from aigie import wrap_openai, wrap_anthropic, wrap_gemini
# OpenAI
from openai import AsyncOpenAI
client = AsyncOpenAI()
traced_client = wrap_openai(client)
# Anthropic
from anthropic import AsyncAnthropic
client = AsyncAnthropic()
traced_client = wrap_anthropic(client)
# Google Gemini
import google.generativeai as genai
traced_genai = wrap_gemini(genai)
LangChain
from aigie.callback import AigieCallbackHandler
from langchain.callbacks import AsyncCallbackManager
from langchain.chains import LLMChain

handler = AigieCallbackHandler()
manager = AsyncCallbackManager([handler])

# Use with your LangChain chains (llm is your configured model)
chain = LLMChain(llm=llm, callback_manager=manager)
LangGraph
from aigie import wrap_langgraph
# Wrap your LangGraph workflow
traced_workflow = wrap_langgraph(workflow)
Strands Agents
from aigie.integrations.strands import patch_strands
from strands import Agent
# Auto-instrumentation (recommended)
patch_strands()
# Now all agents are automatically traced
agent = Agent(tools=[...])
result = agent("What is the capital of France?")
# Or manually register handler
from aigie.integrations.strands import StrandsHandler
handler = StrandsHandler(trace_name="my-agent")
agent = Agent(tools=[...], hooks=[handler])
result = agent("What is the capital of France?")
🎯 Advanced Features
OpenTelemetry Integration
Works with any OpenTelemetry-compatible tool (Datadog, New Relic, Jaeger, etc.):
from aigie import Aigie
from aigie.opentelemetry import setup_opentelemetry

aigie = Aigie()
await aigie.initialize()

# One-line setup
setup_opentelemetry(aigie, service_name="my-service")

# Now all OTel spans automatically go to Aigie!
from opentelemetry import trace

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("operation"):
    # Automatically traced
    pass
Synchronous API
For non-async codebases:
from aigie import AigieSync

aigie = AigieSync()
aigie.initialize()  # Blocking

with aigie.trace("workflow") as trace:
    with trace.span("operation") as span:
        result = do_work()  # Sync code
        span.set_output({"result": result})
Prompt Management
Create, version, and track prompts:
from aigie import Prompt

# Create prompt
prompt = Prompt.chat(
    name="customer_support",
    messages=[{"role": "system", "content": "You are a helpful assistant."}],
    version="1.0",
)

# Use in trace
async with aigie.trace("support") as trace:
    trace.set_prompt(prompt)
    rendered = prompt.render(customer_name="John")
    response = await llm.ainvoke(rendered)
Evaluation System
Automatic quality monitoring:
from aigie import score, feedback
# Score a trace
await score(trace_id, "accuracy", 0.95)
# Collect feedback
await feedback(trace_id, "user_feedback", "Great response!")
W3C Trace Context Propagation
Distributed tracing across microservices:
import httpx

# Extract from incoming request
context = aigie.extract_trace_context(request.headers)

async with aigie.trace("workflow") as trace:
    trace.set_trace_context(context)

    # Propagate to downstream service via W3C headers
    headers = trace.get_trace_headers()
    async with httpx.AsyncClient() as client:
        response = await client.get("https://api.example.com", headers=headers)
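For reference, the W3C `traceparent` header carried by those requests has a fixed text format (`version-traceid-spanid-flags`); a minimal parser, independent of Aigie, looks like:

```python
# Parse a W3C traceparent header: version-traceid-spanid-flags
def parse_traceparent(header: str) -> dict:
    version, trace_id, span_id, flags = header.split("-")
    assert len(trace_id) == 32 and len(span_id) == 16
    return {
        "version": version,
        "trace_id": trace_id,  # 16-byte trace ID, hex-encoded
        "span_id": span_id,    # 8-byte parent span ID, hex-encoded
        "sampled": int(flags, 16) & 0x01 == 1,
    }

ctx = parse_traceparent("00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01")
print(ctx["sampled"])  # True
```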
Streaming Support
Real-time span updates:
async with aigie.trace("workflow") as trace:
    async with trace.span("llm_call", stream=True) as span:
        async for chunk in llm.astream("Hello"):
            span.append_output(chunk)  # Update in real-time
            yield chunk
📚 Core Concepts
Traces and Spans
- Trace: Represents a complete workflow execution
- Span: Represents a single operation within a trace
- Hierarchy: Spans can be nested to represent complex workflows
Context Propagation
Aigie automatically propagates trace context through:
- Async function calls
- Thread boundaries
- HTTP requests (with W3C headers)
- Framework integrations
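Propagation across async calls is typically built on Python's `contextvars`; the sketch below shows the underlying mechanism (it is not Aigie's actual implementation): a value set in the ambient context is visible to awaited calls without being passed as an argument.

```python
import asyncio
import contextvars

# The ambient trace ID travels with the task, not via function arguments.
current_trace = contextvars.ContextVar("current_trace", default=None)

async def inner():
    # No trace argument needed: the context variable is inherited.
    return current_trace.get()

async def workflow():
    current_trace.set("trace-123")
    return await inner()

print(asyncio.run(workflow()))  # trace-123
```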
Event Buffering
Events are automatically buffered and sent in batches:
- Reduces API calls by 99%+
- Configurable batch size and flush interval
- Automatic retry on failures
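The retry-on-failure behavior can be sketched as classic exponential backoff. The function below is illustrative (delays, exception type, and names are assumptions, not Aigie's actual policy): each failed attempt doubles the wait before the next try, up to a retry limit.

```python
import time

def send_with_retry(send, max_retries=3, base_delay=0.5):
    """Retry a flaky send with exponentially growing delays."""
    for attempt in range(max_retries + 1):
        try:
            return send()
        except ConnectionError:
            if attempt == max_retries:
                raise  # out of retries: surface the error
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...

# A send that fails twice, then succeeds:
attempts = {"n": 0}
def flaky_send():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("network blip")
    return "ok"

print(send_with_retry(flaky_send, base_delay=0.01))  # ok
```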
🛠️ Development
Running Tests
# Install development dependencies
pip install aigie[dev]
# Run tests
pytest
# Run with coverage
pytest --cov=aigie --cov-report=html
Code Quality
# Format code
black aigie tests
# Sort imports
isort aigie tests
# Lint
ruff check aigie tests
# Type check
mypy aigie
📖 Documentation
Full documentation is available at https://docs.aigie.io.
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🔗 Links
- Documentation: https://docs.aigie.io
- Source: https://github.com/Kytte-AI/kytte-python-sdk
- Issue Tracker: https://github.com/Kytte-AI/kytte-python-sdk/issues
- Changelog: https://github.com/Kytte-AI/kytte-python-sdk/blob/main/CHANGELOG.md
💬 Support
For support, email support@aigie.io or open an issue on GitHub.
Transform experimental AI pilots into dependable production systems 🚀