
Verse Python SDK


A Python SDK for observability and tracing in AI applications. It supports decorators and context managers, with automatic instrumentation for popular LLM frameworks.

Installation

pip install verse-sdk

Quick Start

from verse_sdk import verse, observe

# Initialize
verse.init(
    app_name="my-app",
    exporters=[verse.exporters.console()],
    vendor="pydantic_ai"  # Optional: auto-instrument LLM calls
)

# Option 1: Decorators (recommended)
@observe()
async def my_function(query: str):
    result = await process_query(query)
    return result

# Option 2: Context managers
async def my_function_v2(query: str):
    with verse.trace("my_function") as trace:
        trace.input(query)
        with verse.span("process_query") as span:
            result = await process_query(query)
            span.output(result)
        trace.output(result)
    return result

Table of Contents

  • Installation
  • Quick Start
  • Initialization
  • Exporters
  • Scope Filtering
  • Decorators
  • Context Managers
  • Context Methods
  • Integrations
  • API Reference
  • Data Formats
  • Best Practices
  • Shutdown
  • Support

Initialization

verse.init(
    app_name="my-app",           # Required: identifies your project
    environment="production",     # Optional: environment label
    exporters=[...],             # Required: list of exporters
    vendor="pydantic_ai",        # Optional: enables auto-instrumentation
    version="1.0.0"              # Optional: app version
)

Exporters

Console

verse.exporters.console()
verse.exporters.console({"scopes": ["agent-workflow-1"]})  # With scope filtering

Langfuse

from verse_sdk import LangfuseConfig

verse.exporters.langfuse(
    LangfuseConfig(
        host="https://cloud.langfuse.com",
        public_key="pk-...",
        private_key="sk-..."
    )
)

# Or use environment variables: LANGFUSE_HOST, LANGFUSE_PUBLIC_KEY, LANGFUSE_PRIVATE_KEY
verse.exporters.langfuse()

OTEL

from verse_sdk import OtelConfig

verse.exporters.otel(
    OtelConfig(host="http://localhost:4318")
)

Verse

from verse_sdk import VerseConfig

verse.exporters.verse(
    VerseConfig(
        api_key="your-api-key",
        host="http://localhost:4318",
        project_id="your-project-id"
    )
)

# Or use environment variables: VERSE_API_KEY, VERSE_HOST, VERSE_PROJECT_ID
verse.exporters.verse()

Scope Filtering

Route traces to specific exporters by scope:

# Configure exporters with scopes
verse.init(
    app_name="my-app",
    exporters=[
        verse.exporters.console({"scopes": ["agent-a"]}),
        verse.exporters.langfuse({"scopes": ["agent-b"]})
    ]
)

# Set scope on traces
@observe(type="trace", scope="agent-a")
def my_function():
    pass

Decorators

Decorators automatically capture inputs, outputs, and errors.

Available Decorators

from verse_sdk import observe

Basic Usage

@observe(type="trace")
async def answer_question(question: str):
    context = await retrieve_context(question)
    return await generate_answer(question, context)

@observe(type="span")
async def retrieve_context(question: str):
    return await db.search(question)

@observe(type="generation")
async def generate_answer(question: str, context: str):
    return await llm.complete(f"Context: {context}\nQ: {question}")

@observe(type="tool")
def search_database(query: str):
    return db.search(query)

Customization

# Custom name
@observe(name="custom_name")
def my_function():
    pass

# Disable input/output capture
@observe(capture_input=False, capture_output=False)
def sensitive_llm_call():
    pass

# Add custom attributes
@observe(level="debug", custom_attr="value")
def detailed_operation():
    pass

Accessing Current Context

from verse_sdk import get_current_trace_context, get_current_span_context

@observe()
def workflow():
    trace_ctx = get_current_trace_context()
    trace_ctx.user("user-123").session("session-456")
    return process_data()

@observe()
def process_data():
    span_ctx = get_current_span_context()
    span_ctx.level("info").metadata({"step": "processing"})
    return "result"

Context Managers

Context managers provide fine-grained control over when attributes are set.

Basic Usage

def process_request(user_id: str, query: str):
    with verse.trace("process_request") as trace:
        trace.input({"user_id": user_id, "query": query})
        trace.session(user_id).user(user_id)

        with verse.span("validate") as span:
            span.input(query).level("debug")
            is_valid = validate(query)
            span.output(is_valid)

        response = None
        if is_valid:
            with verse.generation("llm_call") as gen:
                gen.model("gpt-4").vendor("openai").input(query)
                response = llm.complete(query)
                gen.output(response).usage({
                    "input_tokens": 150,
                    "output_tokens": 50,
                    "total_tokens": 200
                })

        trace.output(response)
        return response

Grouped Context Managers (Python 3.10+)

with (
    verse.trace("my_trace", session_id="user-123") as trace,
    verse.span("my_span", level="info") as span,
):
    result = process()
    span.output(result)
    trace.output(result)

Setting Attributes

# Option 1: During initialization
with verse.trace(name="my_trace", session_id="user-123", scope="agent-a"):
    pass

# Option 2: After initialization (chainable)
with verse.trace("my_trace") as trace:
    trace.session("user-123").scope("agent-a").tags(["production"])

Context Methods

Common Methods (All Contexts)

  • .input(data) - Set input
  • .output(data) - Set output
  • .metadata(dict) - Add metadata
  • .error(exception) - Record error
  • .score(dict) - Add evaluation score
  • .event(name, level, **attrs) - Add event
  • .set_attributes(**kwargs) - Set custom attributes

TraceContext

  • .session(session_id) - Set session ID
  • .user(user_id) - Set user ID
  • .scope(scope) - Set scope for filtering
  • .tags(list) - Add tags

SpanContext

  • .level(level) - Set log level ("info", "debug", "warning")
  • .operation(op) - Set operation type ("tool", "db.query")
  • .status_message(message) - Set status message

GenerationContext

Inherits SpanContext methods, plus:

  • .model(model_name) - Set model identifier
  • .vendor(vendor) - Set model vendor
  • .usage(dict) - Set token usage ({"input_tokens": 150, "output_tokens": 50, "total_tokens": 200})
  • .messages(list) - Set message history
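Each of these setters returns the context itself, which is what makes the chained calls in the examples above possible (e.g. `trace.session(...).user(...)`). As an illustration of that pattern, here is a minimal stand-in class, not the real SDK implementation:

```python
# Minimal illustration of the chainable-setter pattern used by the
# SDK's context objects. This is a stand-in, not the real classes.
class ChainableContext:
    def __init__(self):
        self.attrs = {}

    def _set(self, key, value):
        self.attrs[key] = value
        return self  # returning self is what enables chaining

    def session(self, session_id):
        return self._set("session_id", session_id)

    def user(self, user_id):
        return self._set("user_id", user_id)

    def tags(self, tags):
        return self._set("tags", tags)


ctx = ChainableContext()
ctx.session("session-456").user("user-123").tags(["production"])
print(ctx.attrs)
# {'session_id': 'session-456', 'user_id': 'user-123', 'tags': ['production']}
```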

Integrations

Enable auto-instrumentation by setting the vendor parameter:

Pydantic AI

verse.init(app_name="my-app", vendor="pydantic_ai", exporters=[...])

from pydantic_ai import Agent

agent = Agent("openai:gpt-4")
result = await agent.run("query")  # Automatically traced

LiteLLM

verse.init(app_name="my-app", vendor="litellm", exporters=[...])

from litellm import completion
response = completion(model="gpt-4", messages=[...])  # Automatically traced

LangChain

verse.init(app_name="my-app", vendor="langchain", exporters=[...])
# Your LangChain code is automatically traced

API Reference

Decorators

@observe(name=None, type=str, capture_input=True, capture_output=True, capture_metadata=True, observation_type=str, **attrs)

Creates an observation.

Context Managers

verse.trace(name, session_id=None, user_id=None, scope=None, tags=None, metadata=None, **attrs)

Returns TraceContext

verse.span(name, input=None, output=None, level=None, op=None, status_message=None, metadata=None, **attrs)

Returns SpanContext

verse.generation(name, model=None, vendor=None, input=None, output=None, messages=None, usage=None, **attrs)

Returns GenerationContext

Helper Functions

get_current_trace_context() -> TraceContext

Get the current trace context from the active span. Raises ValueError if unavailable.

get_current_span_context() -> SpanContext

Get the current span context from the active span. Raises ValueError if unavailable.

get_current_generation_context() -> GenerationContext

Get the current generation context from the active span. Raises ValueError if unavailable.
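Because these helpers raise ValueError when no span is active, code that may run outside a trace can wrap them in a small fallback. A sketch of that pattern, where the stub below mimics the documented raising behaviour (real code would import the helper from verse_sdk instead):

```python
# Sketch: "get context or None" fallback for code that may run outside
# any active trace. The stub mimics the documented ValueError; real
# code would import get_current_trace_context from verse_sdk.
def get_current_trace_context():
    raise ValueError("no active span")  # stub: no trace is active here


def current_trace_or_none():
    try:
        return get_current_trace_context()
    except ValueError:
        return None


ctx = current_trace_or_none()
print(ctx)  # None when called outside a trace
```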

Data Formats

Usage:

{"input_tokens": 150, "output_tokens": 50, "total_tokens": 200}

Score:

{"name": "quality", "value": 0.95, "comment": "Excellent"}

Best Practices

  1. Use decorators by default - Cleaner code with automatic input/output capture
  2. Use context managers for fine-grained control - When you need dynamic names or conditional logic
  3. Combine both approaches - Decorators for functions, context managers within functions
  4. Choose appropriate observation types:
    • Trace: Top-level workflows
    • Span: Sub-operations, processing steps
    • Generation: LLM API calls
    • Tool: Function/tool calls in agent systems
  5. Enable auto-instrumentation - Set vendor parameter for supported frameworks
  6. Use scope filtering - Route traces to different exporters by scope

Shutdown

verse.shutdown()  # Flush all traces before exit
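One way to make sure this flush runs even when the process exits through an unexpected path is to register it with the standard-library atexit module. A sketch, where `FakeSDK` stands in for the initialized `verse` object:

```python
import atexit


class FakeSDK:
    """Stand-in for the initialized SDK; real code would use `verse`."""

    def __init__(self):
        self.flushed = False

    def shutdown(self):
        self.flushed = True  # real code: flush all pending traces


sdk = FakeSDK()
atexit.register(sdk.shutdown)  # real code: atexit.register(verse.shutdown)

sdk.shutdown()  # or call explicitly at a known shutdown point
print(sdk.flushed)  # True
```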

Support

For issues and questions, please open an issue on GitHub.
