Enterprise-grade AI agent reliability monitoring and autonomous remediation

Project description

Aigie SDK

Production-grade Python SDK for integrating Aigie monitoring into your AI agent workflows.

✨ Features

  • 🚀 Event Buffering: 10-100x performance improvement with batch uploads
  • 🎯 Decorator Support: 50%+ less boilerplate code
  • ⚙️ Flexible Configuration: Config class with sensible defaults
  • 🔄 Automatic Retries: Exponential backoff with configurable policies
  • 🔗 LangChain Integration: Seamless callback handler
  • 📊 Production Ready: Handles network failures, race conditions, and more
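The retry policy in the feature list can be illustrated with a generic exponential-backoff loop. This is a standalone sketch of the technique, not Aigie's internal implementation; the function names and defaults here are illustrative.

```python
import random
import time

def retry_with_backoff(func, max_retries=3, base_delay=0.5, max_delay=8.0):
    """Call func(), retrying on exception with exponential backoff plus jitter."""
    for attempt in range(max_retries + 1):
        try:
            return func()
        except Exception:
            if attempt == max_retries:
                raise  # retries exhausted: surface the last error
            # Delay doubles each attempt, capped at max_delay, with random jitter
            # so concurrent clients don't retry in lockstep.
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay * random.uniform(0.5, 1.0))

calls = {"n": 0}

def flaky():
    """Simulated endpoint that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network failure")
    return "ok"

print(retry_with_backoff(flaky))  # succeeds on the third attempt
```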

Quick Start

Installation

pip install aigie

Basic Usage

Option 1: Context Manager (Traditional)

from aigie import Aigie

aigie = Aigie()
await aigie.initialize()

async with aigie.trace("My Workflow") as trace:
    async with trace.span("operation", type="llm") as span:
        result = await do_work()
        span.set_output({"result": result})

Option 2: Decorator (Recommended - 50% less code!)

from aigie import Aigie

aigie = Aigie()
await aigie.initialize()

@aigie.trace(name="my_workflow")
async def my_workflow():
    @aigie.span(name="operation", type="llm")
    async def operation():
        return await do_work()
    return await operation()

Option 3: With Configuration

from aigie import Aigie, Config

config = Config(
    api_url="https://api.aigie.com",
    api_key="your-key",
    batch_size=100,  # Buffer 100 events before sending
    flush_interval=5.0  # Or flush every 5 seconds
)
aigie = Aigie(config=config)
await aigie.initialize()

Configuration

Environment Variables

export AIGIE_API_URL=http://your-aigie-instance:8000/api
export AIGIE_API_KEY=your-api-key-here
export AIGIE_BATCH_SIZE=100
export AIGIE_FLUSH_INTERVAL=5.0
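Loading these variables with typed fallbacks can be sketched in plain Python. The variable names come from above; the fallback values (and the localhost default URL) are assumptions for illustration, not the SDK's documented defaults.

```python
import os

def load_config(env=os.environ):
    """Read Aigie settings from the environment, falling back to defaults."""
    return {
        "api_url": env.get("AIGIE_API_URL", "http://localhost:8000/api"),
        "api_key": env.get("AIGIE_API_KEY"),  # no safe default for secrets
        "batch_size": int(env.get("AIGIE_BATCH_SIZE", "100")),
        "flush_interval": float(env.get("AIGIE_FLUSH_INTERVAL", "5.0")),
    }

cfg = load_config({"AIGIE_API_KEY": "test-key", "AIGIE_BATCH_SIZE": "50"})
print(cfg["batch_size"])  # 50
```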

Config Object

from aigie import Config

config = Config(
    api_url="https://api.aigie.com",
    api_key="your-key",
    batch_size=100,
    flush_interval=5.0,
    enable_buffering=True,  # Default: True
    max_retries=3
)

Performance

Before (No Buffering)

  • 1000 spans = 1000+ API calls
  • ~30 seconds total time
  • High network overhead

After (With Buffering)

  • 1000 spans = 2-10 API calls
  • ~0.5 seconds total time
  • 99%+ reduction in API calls
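The batching behind these numbers can be modeled with a minimal buffer that flushes when either the size threshold or the time interval is reached. This is a simplified sketch of the pattern, not the SDK's actual internals.

```python
import time

class EventBuffer:
    """Collect events and flush them in batches (size- or time-triggered)."""

    def __init__(self, send, batch_size=100, flush_interval=5.0):
        self.send = send              # callable that uploads a list of events
        self.batch_size = batch_size
        self.flush_interval = flush_interval
        self.events = []
        self.last_flush = time.monotonic()

    def add(self, event):
        self.events.append(event)
        interval_due = time.monotonic() - self.last_flush >= self.flush_interval
        if len(self.events) >= self.batch_size or interval_due:
            self.flush()

    def flush(self):
        if self.events:
            self.send(self.events)    # one API call for the whole batch
            self.events = []
        self.last_flush = time.monotonic()

batches = []
buf = EventBuffer(batches.append, batch_size=100)
for i in range(1000):
    buf.add({"span": i})
buf.flush()  # flush any remainder on shutdown
print(len(batches))  # 10 batched calls instead of 1000
```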

Advanced Features

OpenTelemetry Integration

Works with any OpenTelemetry-compatible tool (Datadog, New Relic, Jaeger, etc.):

from aigie import Aigie
from aigie.opentelemetry import setup_opentelemetry

aigie = Aigie()
await aigie.initialize()

# One-line setup
setup_opentelemetry(aigie, service_name="my-service")

# Now all OTel spans automatically go to Aigie!
from opentelemetry import trace
tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("operation"):
    # Automatically traced
    pass

Synchronous API

For non-async codebases:

from aigie import AigieSync

aigie = AigieSync()
aigie.initialize()  # Blocking

with aigie.trace("workflow") as trace:
    with trace.span("operation") as span:
        result = do_work()  # Sync code
        span.set_output({"result": result})

Installation

Basic

pip install aigie

With OpenTelemetry

pip install aigie[opentelemetry]

With LangChain

pip install aigie[langchain]

All Features

pip install aigie[all]

Advanced Features (Phase 3)

W3C Trace Context Propagation

Distributed tracing across microservices:

import httpx

# Extract from incoming request
context = aigie.extract_trace_context(request.headers)

async with aigie.trace("workflow") as trace:
    trace.set_trace_context(context)

    # Propagate to downstream service
    headers = trace.get_trace_headers()
    async with httpx.AsyncClient() as client:
        response = await client.get("https://api.example.com", headers=headers)
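For reference, the W3C `traceparent` header that carries this context has the fixed shape `version-trace_id-parent_id-flags`. Generating and parsing one is independent of the SDK; the helper names below are illustrative.

```python
import re
import secrets

def make_traceparent(trace_id=None, parent_id=None, sampled=True):
    """Build a W3C traceparent header: version-trace_id-parent_id-flags."""
    trace_id = trace_id or secrets.token_hex(16)   # 32 lowercase hex chars
    parent_id = parent_id or secrets.token_hex(8)  # 16 lowercase hex chars
    flags = "01" if sampled else "00"              # bit 0 = sampled
    return f"00-{trace_id}-{parent_id}-{flags}"

TRACEPARENT_RE = re.compile(
    r"^([0-9a-f]{2})-([0-9a-f]{32})-([0-9a-f]{16})-([0-9a-f]{2})$"
)

def parse_traceparent(header):
    """Split a traceparent header into its four fields (or raise ValueError)."""
    m = TRACEPARENT_RE.match(header)
    if not m:
        raise ValueError(f"malformed traceparent: {header!r}")
    version, trace_id, parent_id, flags = m.groups()
    return {"version": version, "trace_id": trace_id,
            "parent_id": parent_id, "sampled": flags == "01"}

header = make_traceparent()
print(parse_traceparent(header)["sampled"])  # True
```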

Prompt Management

Create, version, and track prompts:

# Create prompt
prompt = await aigie.prompts.create(
    name="customer_support",
    template="You are a helpful assistant. Customer: {customer_name}",
    version="1.0"
)

# Use in trace
async with aigie.trace("support") as trace:
    trace.set_prompt(prompt)
    rendered = prompt.render(customer_name="John")
    response = await llm.ainvoke(rendered)
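Templates with `{customer_name}`-style placeholders render like Python's `str.format`. A minimal versioned-prompt object can be sketched as follows; this is an illustration of the pattern, not the SDK's `Prompt` class.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Prompt:
    """A minimal versioned prompt: immutable template plus metadata."""
    name: str
    template: str
    version: str

    def render(self, **variables):
        # str.format raises KeyError if a placeholder is left unfilled,
        # which surfaces template/variable mismatches early.
        return self.template.format(**variables)

prompt = Prompt(
    name="customer_support",
    template="You are a helpful assistant. Customer: {customer_name}",
    version="1.0",
)
print(prompt.render(customer_name="John"))
```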

Evaluation Hooks

Automatic quality monitoring:

from aigie import EvaluationHook, ScoreType

hook = EvaluationHook(
    name="accuracy",
    evaluator=accuracy_evaluator,
    score_type=ScoreType.ACCURACY
)

async with aigie.trace("workflow") as trace:
    trace.add_evaluation_hook(hook)
    result = await do_work()
    await trace.run_evaluations(expected, result)
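The hook pattern above reduces to wrapping an evaluator callable and recording a named score. A self-contained sketch of that pattern (not the SDK's `EvaluationHook`; the class and evaluator here are fabricated for illustration):

```python
class SimpleEvaluationHook:
    """Wrap an evaluator callable and record a named score."""

    def __init__(self, name, evaluator):
        self.name = name
        self.evaluator = evaluator  # (expected, actual) -> float in [0, 1]

    def run(self, expected, actual):
        return {"name": self.name, "score": self.evaluator(expected, actual)}

def exact_match(expected, actual):
    """Toy evaluator: full credit only for an exact match."""
    return 1.0 if expected == actual else 0.0

hook = SimpleEvaluationHook("accuracy", exact_match)
print(hook.run("42", "42"))  # {'name': 'accuracy', 'score': 1.0}
```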

Streaming Support

Real-time span updates:

async with aigie.trace("workflow") as trace:
    async with trace.span("llm_call", stream=True) as span:
        async for chunk in llm.astream("Hello"):
            span.append_output(chunk)  # Update in real-time
            yield chunk
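Streaming accumulation like the above amounts to appending each chunk as it arrives while still yielding it to the caller. A self-contained asyncio sketch; the `fake_stream` stand-in for `llm.astream` is fabricated for illustration.

```python
import asyncio

async def fake_stream(text, chunk_size=4):
    """Stand-in for an LLM stream: yield the response in small chunks."""
    for i in range(0, len(text), chunk_size):
        await asyncio.sleep(0)  # let other tasks run between chunks
        yield text[i:i + chunk_size]

async def stream_and_record(source):
    """Yield chunks to the caller while accumulating the full output."""
    buffer = []
    async for chunk in source:
        buffer.append(chunk)  # a span would record this incrementally
        yield chunk
    print("final output:", "".join(buffer))

async def main():
    received = []
    async for chunk in stream_and_record(fake_stream("Hello, world!")):
        received.append(chunk)
    return "".join(received)

result = asyncio.run(main())
```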

Download files

Download the file for your platform.

Source Distribution

aigie-0.2.0.tar.gz (459.1 kB)

Uploaded Source

Built Distribution


aigie-0.2.0-py3-none-any.whl (539.1 kB)

Uploaded Python 3

File details

Details for the file aigie-0.2.0.tar.gz.

File metadata

  • Download URL: aigie-0.2.0.tar.gz
  • Upload date:
  • Size: 459.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for aigie-0.2.0.tar.gz:

  • SHA256: f45ee089b27e9c8adcf3e75e2b42de475e204cd8ba9d669ac1022880f13dd280
  • MD5: b6f3e2de6bfc12ad0bb778935adbf2db
  • BLAKE2b-256: 0e680a767a3c7f830f4aded4894cb0f8c3d8f2e3b20d5e5c8a364e77846bd077


Provenance

The following attestation bundles were made for aigie-0.2.0.tar.gz:

Publisher: publish-sdk.yml on NirelNemirovsky/kytte-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file aigie-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: aigie-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 539.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for aigie-0.2.0-py3-none-any.whl:

  • SHA256: 7e9dc6d998ca6a86e2871f8c2bc6ee9d2825ea49a5e254c812e29b2b13f162b8
  • MD5: 104282f33e7d74936162f1bc8cd0b42d
  • BLAKE2b-256: 5d178f4349a12aebc7782de3862094b05df09bc224ed67fc286836efd3b1e2b4


Provenance

The following attestation bundles were made for aigie-0.2.0-py3-none-any.whl:

Publisher: publish-sdk.yml on NirelNemirovsky/kytte-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
