Aigie Python SDK

Enterprise-grade AI agent reliability monitoring and autonomous remediation

Production-grade Python SDK for integrating Aigie monitoring and reliability infrastructure into your AI agent workflows.

Unlike traditional observability tools that only monitor, Aigie:

  • DETECTS context drift and errors before they impact users
  • FIXES issues automatically through self-healing workflows
  • PREVENTS failures with predictive intervention

🎯 Why Aigie?

95% of AI agents never reach production due to context drift, tool errors, and runtime instability. Aigie provides the infrastructure that makes autonomous AI reliable and production-grade.

Key Differentiators

| Feature                 | Aigie | Traditional Tools |
| ----------------------- | ----- | ----------------- |
| Context drift detection | ✅    | ❌                |
| Auto-error correction   | ✅    | ❌                |
| Production guardrails   | ✅    | ❌                |
| Self-healing workflows  | ✅    | ❌                |
| Reliability scoring     | ✅    | ❌                |
| Predictive prevention   | ✅    | ❌                |

✨ Features

  • 🚀 Event Buffering: 10-100x performance improvement with batch uploads
  • 🎯 Decorator Support: 50%+ less boilerplate code
  • ⚙️ Flexible Configuration: Config class with sensible defaults
  • 🔄 Automatic Retries: Exponential backoff with configurable policies
  • 🔗 Framework Integrations: LangChain, LangGraph, Browser-Use, CrewAI, AutoGen, and more
  • 📊 Production Ready: Handles network failures, race conditions, and more
  • 🔍 Auto-Instrumentation: Automatic tracing for OpenAI, Anthropic, Gemini, and more
  • 💰 Cost Tracking: Built-in token counting and cost calculation
  • 📈 Evaluation System: Comprehensive scoring and evaluation framework
  • 🛡️ Safety Metrics: PII detection, toxicity, bias, and prompt-injection scanning

📦 Installation

Basic Installation

pip install aigie

With Optional Features

# Compression (recommended for production - 50-90% bandwidth savings)
pip install aigie[compression]

# LLM Provider integrations
pip install aigie[openai]      # OpenAI wrapper
pip install aigie[anthropic]   # Anthropic Claude
pip install aigie[gemini]      # Google Gemini

# Agent Framework integrations
pip install aigie[langchain]   # LangChain integration
pip install aigie[langgraph]   # LangGraph integration
pip install aigie[browser-use] # Browser-Use integration
pip install aigie[crewai]      # CrewAI multi-agent
pip install aigie[autogen]     # AutoGen/AG2 conversations
pip install aigie[llamaindex]  # LlamaIndex RAG
pip install aigie[openai-agents]  # OpenAI Agents SDK
pip install aigie[dspy]        # DSPy modular LLM
pip install aigie[claude-agent-sdk]  # Anthropic Claude Agent SDK
pip install aigie[strands]  # Strands Agents SDK (AWS/Anthropic)
pip install aigie[instructor]  # Instructor structured outputs
pip install aigie[semantic-kernel]  # Microsoft Semantic Kernel

# Vector DB integrations
pip install aigie[pinecone]    # Pinecone
pip install aigie[qdrant]      # Qdrant
pip install aigie[chromadb]    # ChromaDB
pip install aigie[weaviate]    # Weaviate
pip install aigie[vectordbs]   # All vector DBs

# Observability
pip install aigie[opentelemetry]  # OpenTelemetry support

# All features
pip install aigie[all]
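The 50-90% bandwidth savings quoted for the compression extra come from how well repetitive telemetry payloads compress. A quick, generic illustration using plain gzip on a JSON batch (this sketches the idea only; it is not the SDK's wire format):

```python
import gzip
import json

# A batch of repetitive span events, similar in shape to a telemetry upload.
payload = json.dumps(
    [{"span_id": i, "name": "llm_call", "status": "ok"} for i in range(200)]
).encode()

compressed = gzip.compress(payload)
savings = 1 - len(compressed) / len(payload)  # typically well above 50% for payloads like this
```

Highly repetitive key names and values are exactly what batch uploads contain, which is why compression pays off in production.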

🚀 Quick Start

Option 1: Decorator (Recommended - 50% less code!)

from aigie import traceable

@traceable(run_type="agent")
async def my_agent(query: str):
    # Your agent logic here
    result = await process_query(query)
    return result

# Use it normally
result = await my_agent("What is the weather?")

Option 2: Context Manager (Traditional)

from aigie import Aigie

aigie = Aigie()
await aigie.initialize()

async with aigie.trace("My Workflow") as trace:
    async with trace.span("operation", type="llm") as span:
        result = await do_work()
        span.set_output({"result": result})

Option 3: With Configuration

from aigie import Aigie, Config

config = Config(
    api_url="https://api.aigie.com",
    api_key="your-key",
    batch_size=100,  # Buffer 100 events before sending
    flush_interval=5.0  # Or flush every 5 seconds
)
aigie = Aigie(config=config)
await aigie.initialize()

🔧 Configuration

Environment Variables

export AIGIE_API_URL=http://your-aigie-instance:8000/api
export AIGIE_API_KEY=your-api-key-here
export AIGIE_BATCH_SIZE=100
export AIGIE_FLUSH_INTERVAL=5.0

Config Object

from aigie import Config

config = Config(
    api_url="https://api.aigie.com",
    api_key="your-key",
    batch_size=100,
    flush_interval=5.0,
    enable_buffering=True,  # Default: True
    max_retries=3
)
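With max_retries set as above, failed uploads are retried with exponential backoff. The delay schedule generally looks like this; an illustrative sketch, not the SDK's internal code:

```python
def backoff_delays(max_retries: int = 3, base: float = 1.0, factor: float = 2.0) -> list:
    """Delay (in seconds) before each retry attempt: base * factor**attempt."""
    return [base * factor ** attempt for attempt in range(max_retries)]

# Three retries back off as 1s, 2s, 4s before giving up.
delays = backoff_delays()
```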

📊 Performance

Before (No Buffering)

  • 1000 spans = 1000+ API calls
  • ~30 seconds total time
  • High network overhead

After (With Buffering)

  • 1000 spans = 2-10 API calls
  • ~0.5 seconds total time
  • 99%+ reduction in API calls
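The numbers above follow directly from the batching arithmetic; a quick check in plain Python:

```python
import math

spans = 1000
batch_size = 100                      # as in Config(batch_size=100)
calls_without_buffering = spans       # one API call per span
calls_with_buffering = math.ceil(spans / batch_size)
reduction = 1 - calls_with_buffering / calls_without_buffering
# 10 uploads instead of 1000: a 99% reduction in API calls
```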

🔌 Integrations

LLM Providers

from aigie import wrap_openai, wrap_anthropic, wrap_gemini

# OpenAI
from openai import AsyncOpenAI
client = AsyncOpenAI()
traced_client = wrap_openai(client)

# Anthropic
from anthropic import AsyncAnthropic
client = AsyncAnthropic()
traced_client = wrap_anthropic(client)

# Google Gemini
import google.generativeai as genai
traced_genai = wrap_gemini(genai)

LangChain

from aigie.callback import AigieCallbackHandler
from langchain.callbacks import AsyncCallbackManager
from langchain.chains import LLMChain

handler = AigieCallbackHandler()
manager = AsyncCallbackManager([handler])

# Use with your LangChain chains
chain = LLMChain(llm=llm, callback_manager=manager)

LangGraph

from aigie import wrap_langgraph

# Wrap your LangGraph workflow
traced_workflow = wrap_langgraph(workflow)

Strands Agents

from aigie.integrations.strands import patch_strands
from strands import Agent

# Auto-instrumentation (recommended)
patch_strands()

# Now all agents are automatically traced
agent = Agent(tools=[...])
result = agent("What is the capital of France?")

# Or manually register handler
from aigie.integrations.strands import StrandsHandler

handler = StrandsHandler(trace_name="my-agent")
agent = Agent(tools=[...], hooks=[handler])
result = agent("What is the capital of France?")

🎯 Advanced Features

OpenTelemetry Integration

Works with any OpenTelemetry-compatible tool (Datadog, New Relic, Jaeger, etc.):

from aigie import Aigie
from aigie.opentelemetry import setup_opentelemetry

aigie = Aigie()
await aigie.initialize()

# One-line setup
setup_opentelemetry(aigie, service_name="my-service")

# Now all OTel spans automatically go to Aigie!
from opentelemetry import trace
tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("operation"):
    # Automatically traced
    pass

Synchronous API

For non-async codebases:

from aigie import AigieSync

aigie = AigieSync()
aigie.initialize()  # Blocking

with aigie.trace("workflow") as trace:
    with trace.span("operation") as span:
        result = do_work()  # Sync code
        span.set_output({"result": result})

Prompt Management

Create, version, and track prompts:

from aigie import Prompt

# Create prompt
prompt = Prompt.chat(
    name="customer_support",
    messages=[{"role": "system", "content": "You are a helpful assistant for {customer_name}."}],
    version="1.0"
)

# Use in trace
async with aigie.trace("support") as trace:
    trace.set_prompt(prompt)
    rendered = prompt.render(customer_name="John")
    response = await llm.ainvoke(rendered)

Evaluation System

Automatic quality monitoring:

from aigie import score, feedback

# Score a trace
await score(trace_id, "accuracy", 0.95)

# Collect feedback
await feedback(trace_id, "user_feedback", "Great response!")

W3C Trace Context Propagation

Distributed tracing across microservices:

# Extract from incoming request
context = aigie.extract_trace_context(request.headers)

async with aigie.trace("workflow") as trace:
    trace.set_trace_context(context)
    
    # Propagate to downstream service (httpx.get is sync; use AsyncClient in async code)
    headers = trace.get_trace_headers()
    async with httpx.AsyncClient() as client:
        response = await client.get("https://api.example.com", headers=headers)
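On the wire, W3C Trace Context travels as a traceparent header with the shape version-traceid-parentid-flags (per the W3C spec). The SDK's extract_trace_context and get_trace_headers handle this for you; the self-contained sketch below only illustrates the header format:

```python
def build_traceparent(trace_id: str, span_id: str, sampled: bool = True) -> str:
    """Format a W3C traceparent header: {version}-{trace-id}-{parent-id}-{flags}."""
    return f"00-{trace_id}-{span_id}-{'01' if sampled else '00'}"

def parse_traceparent(header: str) -> dict:
    """Split a traceparent header back into its four fields."""
    version, trace_id, span_id, flags = header.split("-")
    return {"version": version, "trace_id": trace_id,
            "span_id": span_id, "sampled": flags == "01"}

# 32-hex-digit trace id, 16-hex-digit parent span id (example values).
header = build_traceparent("0af7651916cd43dd8448eb211c80319c", "b7ad6b7169203331")
```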

Streaming Support

Real-time span updates:

async with aigie.trace("workflow") as trace:
    async with trace.span("llm_call", stream=True) as span:
        async for chunk in llm.astream("Hello"):
            span.append_output(chunk)  # Update in real-time
            yield chunk

📚 Core Concepts

Traces and Spans

  • Trace: Represents a complete workflow execution
  • Span: Represents a single operation within a trace
  • Hierarchy: Spans can be nested to represent complex workflows
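A toy model of that hierarchy (plain dataclasses, not the SDK's classes): the trace is the root of the tree and each span attaches beneath its parent.

```python
from dataclasses import dataclass, field

@dataclass
class Span:
    name: str
    children: list = field(default_factory=list)

    def span(self, name: str) -> "Span":
        """Open a child span nested under this one."""
        child = Span(name)
        self.children.append(child)
        return child

trace = Span("my_workflow")        # the trace is the root of the tree
llm = trace.span("llm_call")       # one operation inside the workflow
llm.span("token_stream")           # nested deeper, under the LLM call
```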

Context Propagation

Aigie automatically propagates trace context through:

  • Async function calls
  • Thread boundaries
  • HTTP requests (with W3C headers)
  • Framework integrations

Event Buffering

Events are automatically buffered and sent in batches:

  • Reduces API calls by 99%+
  • Configurable batch size and flush interval
  • Automatic retry on failures
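The buffering behavior can be pictured with a minimal sketch (illustrative only, not the SDK's internals): events accumulate until either the batch size is reached or the flush interval elapses, and each flush costs one upload.

```python
import time

class EventBuffer:
    """Sketch of batched uploads: flush when full or when the interval elapses."""

    def __init__(self, batch_size=100, flush_interval=5.0, send=print):
        self.batch_size = batch_size
        self.flush_interval = flush_interval
        self.send = send                      # callable that uploads one batch
        self.events = []
        self.last_flush = time.monotonic()

    def add(self, event):
        self.events.append(event)
        full = len(self.events) >= self.batch_size
        stale = time.monotonic() - self.last_flush >= self.flush_interval
        if full or stale:
            self.flush()

    def flush(self):
        if self.events:
            self.send(self.events)            # one API call for the whole batch
        self.events = []
        self.last_flush = time.monotonic()

# Demo: with batch_size=3, seven events go out as three uploads.
batches = []
buf = EventBuffer(batch_size=3, flush_interval=60.0, send=batches.append)
for event in range(7):
    buf.add(event)
buf.flush()  # flush the final partial batch
```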

๐Ÿ› ๏ธ Development

Running Tests

# Install development dependencies
pip install aigie[dev]

# Run tests
pytest

# Run with coverage
pytest --cov=aigie --cov-report=html

Code Quality

# Format code
black aigie tests

# Sort imports
isort aigie tests

# Lint
ruff check aigie tests

# Type check
mypy aigie

📖 Documentation

๐Ÿค Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🔗 Links

💬 Support

For support, email support@aigie.io or open an issue on GitHub.


Transform experimental AI pilots into dependable production systems 🚀
