
Cloud Native Observability SDK for LLM applications


Observicia SDK

Observicia is a Cloud Native observability and policy control SDK for LLM applications. It integrates seamlessly with the CNCF-native observability stack while offering comprehensive token tracking, policy enforcement, and PII protection capabilities.


Features

  • Token Tracking and Management

    • Real-time token usage monitoring across providers
    • Stream-aware token counting
    • Token usage retention and cleanup
    • Per-session token tracking
    • Configurable data retention policies
  • LLM Backend Support

    • OpenAI
      • Chat completions (sync/async)
      • Text completions (sync/async)
      • Embeddings
      • Image generation
      • File operations
      • Streaming support
    • Ollama
      • Local model deployment
      • Chat completions
      • Text generation
      • Embeddings
      • Streaming support
    • WatsonX
      • Foundation models integration
      • Text generation
      • Chat completions
      • Parameter controls
    • Basic scaffolding for:
      • Anthropic
      • LiteLLM
  • Transaction Tracking

    • Multi-round conversation tracking
    • Transaction lifecycle management
    • Metadata and state tracking
    • Parent-child transaction relationships
    • Transaction performance metrics
  • Chat Logging and Analytics

    • Structured chat history logging
    • Conversation flow analysis
    • Interaction metrics
    • Policy compliance logging
    • Chat completion tracking
  • Telemetry Storage and Export

    • SQLite exporter for persistent telemetry storage
      • Structured schema for token usage and metrics
      • Transaction and trace correlation
      • Query-friendly format for analytics
    • Redis exporter with configurable retention
      • Time-based data retention policies
      • Real-time metrics access
      • Distributed telemetry storage
    • OpenTelemetry integration
      • Standard OTLP export support
      • Custom attribute mapping
      • Span context preservation
  • Policy Enforcement

    • Integration with Open Policy Agent (OPA)
    • Support for multiple policy evaluation levels
    • Risk level assessment (low, medium, high, critical)
    • Custom policy definition support
    • Synchronous and asynchronous policy evaluation
  • Framework Integration

    • LangChain support
      • Conversation chain monitoring
      • Chain metrics
      • Token usage across abstractions
  • Observability Features

    • OpenTelemetry integration
    • Span-based tracing for all LLM operations
    • Configurable logging (console, file, OTLP)
    • Mermaid diagram generation from telemetry data
    • Detailed request/response tracing
    • Custom attribute tracking

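Stream-aware token counting means usage is accumulated chunk by chunk as a response streams in, rather than read once from a final response object. A minimal, self-contained sketch of the idea follows; the counter class and chunk values are illustrative only, not the SDK's actual internals:

```python
from dataclasses import dataclass, field

@dataclass
class StreamTokenCounter:
    """Toy accumulator for token usage across streamed response chunks."""
    prompt_tokens: int = 0
    completion_tokens: int = 0
    per_session: dict = field(default_factory=dict)

    def record_chunk(self, session_id: str, tokens: int) -> None:
        # Each streamed chunk contributes a few completion tokens;
        # totals are kept both globally and per session.
        self.completion_tokens += tokens
        self.per_session[session_id] = self.per_session.get(session_id, 0) + tokens

    @property
    def total(self) -> int:
        return self.prompt_tokens + self.completion_tokens

counter = StreamTokenCounter(prompt_tokens=12)
for chunk_tokens in [3, 5, 2]:  # simulated streamed chunks
    counter.record_chunk("session-1", chunk_tokens)

print(counter.total)                     # 12 prompt + 10 completion = 22
print(counter.per_session["session-1"])  # 10
```

Observicia performs this bookkeeping automatically inside its provider patches; the sketch only shows why per-session accumulation has to happen chunk by chunk for streaming responses.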
Quick Start

  1. Install the SDK:
pip install observicia
  2. Create a configuration file (observicia_config.yaml):
service_name: my-service
otel_endpoint: http://localhost:4317
opa_endpoint: http://localhost:8181/
policies:
  - name: pii_check
    path: policies/pii
    description: Check for PII in responses
    required_trace_level: enhanced
    risk_level: high
logging:
  file: "app.json"
  telemetry:
    enabled: true
    format: "json"
    redis:
      enabled: true
      host: "localhost"
      port: 6379
      db: 0
      key_prefix: "observicia:telemetry:"
      retention_hours: 24
  messages:
    enabled: true
    level: "INFO"
  chat:
    enabled: true
    level: "both"
    file: "chat.log"
  3. Initialize in your code:
from observicia import init
from observicia.core.context_manager import ObservabilityContext

# Required - Initialize Observicia
init()

# Optional - Set user ID for tracking
ObservabilityContext.set_user_id("user123")

# Optional - Start a conversation transaction
transaction_id = ObservabilityContext.start_transaction(
    metadata={"conversation_type": "chat"}
)

# Use with OpenAI
from openai import OpenAI
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)

# Or use with Ollama
import ollama
response = ollama.chat(
    model="llama2",
    messages=[{"role": "user", "content": "Hello!"}]
)

# Optional - End the transaction
ObservabilityContext.end_transaction(
    transaction_id,
    metadata={"resolution": "completed"}
)

Architecture

flowchart TB
    App[Application] --> SDK[Observicia SDK]
    subgraph LLM Backends
        OpenAI[OpenAI API]
        Ollama[Ollama Local]
        Anthropic[Anthropic API]
        LiteLLM[LiteLLM]
        WatsonX[WatsonX]
    end

    SDK --> OpenAI
    SDK --> Ollama
    SDK --> Anthropic
    SDK --> LiteLLM
    SDK --> WatsonX

    SDK --> OPA[Open Policy Agent]
    SDK --> OTEL[OpenTelemetry Collector]
    SDK --> SQLite[(SQLite)]
    SDK --> Redis[(Redis)]

    OTEL --> Jaeger[Jaeger]
    OTEL --> Prom[Prometheus]

    OPA --> PII[PII Detection Service]
    OPA --> Compliance[Prompt Compliance Service]

    subgraph Telemetry Storage
        SQLite
        Redis
    end

    style OpenAI fill:#85e,color:#fff
    style Ollama fill:#85e,color:#fff
    style WatsonX fill:#85e,color:#fff
    style Anthropic fill:#ccc,color:#666
    style LiteLLM fill:#ccc,color:#666

Example Applications

The SDK includes three example applications demonstrating different use cases:

  1. Simple Chat Application (examples/simple-chat)

    • Basic chat interface using OpenAI
    • Demonstrates token tracking and tracing
    • Shows streaming response handling
    • Includes transaction management
  2. RAG Application (examples/rag-app)

    • Retrieval-Augmented Generation example
    • Shows policy enforcement for PII protection
    • Demonstrates context tracking
    • Includes secure document retrieval
  3. LangChain Chat (examples/langchain-chat)

    • Integration with LangChain framework
    • Shows conversation chain tracking
    • Token tracking across abstractions

Deployment

Prerequisites

  • Kubernetes cluster with:
    • OpenTelemetry Collector
    • Open Policy Agent
    • Jaeger (optional)
    • Prometheus (optional)

Example Kubernetes Deployment

See the deploy/k8s directory for complete deployment manifests.

Core Components

  • Context Manager: Manages trace context, transactions and session tracking
  • Policy Engine: Handles policy evaluation and enforcement
  • Token Tracker: Monitors token usage across providers
  • Patch Manager: Manages LLM provider SDK instrumentation
  • Tracing Manager: Handles OpenTelemetry integration
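The Context Manager's transaction model, a lifecycle with parent-child links and attached metadata, can be pictured with a small self-contained sketch. The class below is a toy model for illustration only, not Observicia's implementation:

```python
import uuid

class TransactionTracker:
    """Toy model of transaction lifecycle with parent-child relationships."""

    def __init__(self):
        self.transactions = {}

    def start(self, parent_id=None, metadata=None):
        # Each transaction gets a unique id and an optional parent link.
        tx_id = str(uuid.uuid4())
        self.transactions[tx_id] = {
            "parent": parent_id,
            "metadata": metadata or {},
            "state": "active",
        }
        return tx_id

    def end(self, tx_id, metadata=None):
        # Ending a transaction records final state and merges closing metadata.
        tx = self.transactions[tx_id]
        tx["state"] = "completed"
        tx["metadata"].update(metadata or {})

    def children(self, tx_id):
        return [t for t, v in self.transactions.items() if v["parent"] == tx_id]

tracker = TransactionTracker()
root = tracker.start(metadata={"conversation_type": "chat"})
child = tracker.start(parent_id=root)  # e.g. one round of a conversation
tracker.end(child)
tracker.end(root, metadata={"resolution": "completed"})
print(tracker.children(root))
```

Parent-child links are what let a multi-round conversation appear as one root transaction with a child per round, mirroring the `start_transaction`/`end_transaction` calls from the Quick Start.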

Token Usage Visualization

The SDK includes sample tools to visualize token usage metrics through Grafana dashboards.

Token Usage Dashboard

Development Status

  • ✅ Core Framework
  • ✅ OpenAI Integration
  • ✅ Basic Policy Engine
  • ✅ Token Tracking
  • ✅ OpenTelemetry Integration
  • ✅ Transaction Management
  • ✅ Chat Logging
  • ✅ LangChain Support
  • 🚧 Additional Provider Support
  • 🚧 Advanced Policy Features
  • 🚧 UI Components

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

