Verse Python SDK

A Python SDK for observability and tracing in AI applications.

Quick Start

from verse_sdk import verse

# Initialize the SDK
verse.init(
    app_name="my-app",
    environment="development",
    exporters=[verse.exporters.console()],
    vendor="pydantic_ai",
    version="1.0.0"
)

# Use context managers (or decorators) for tracing
def my_function():
    with (
        verse.trace() as trace,
        verse.span() as span,
        verse.generation() as generation,
    ):
        # Your code here
        pass

Initialization

Argument     Type            Required  Description
app_name     str             Yes       Identifies your observability project
environment  str             No        Environment classification (dev, prod, etc.)
exporters    list[Exporter]  Yes       OpenTelemetry data exporters
vendor       str             Yes       Enables auto-instrumentation for LLM clients
version      str             No        Application version

Exporters

On any exporter, you can filter outbound data using scopes:

console_for_agent_a = verse.exporters.console({"scopes": ["agent-a"]})
console_for_agent_b = verse.exporters.console({"scopes": ["agent-b"]})

Not every exporter takes additional configuration, but scopes is an attribute that all exporters accept.

Console

Print observations to terminal:

verse.exporters.console()

Langfuse

Push trace data to Langfuse:

# With explicit configuration
verse.exporters.langfuse({
    "host": "https://cloud.langfuse.com",
    "private_key": "your-private-key",
    "public_key": "your-public-key",
    "region": "us-east-1"
})

# Or use environment variables
verse.exporters.langfuse()

Context Managers

Automatically track spans and manage observability scope:

def my_function():
    with (
        verse.trace() as trace,
        verse.span() as span,
        verse.generation() as generation,
    ):
        # Context managers accept any argument from their respective Context model
        trace.session("user-123").scope("agent-a")
        span.input("processing data").level("info")
        generation.model("gpt-4").vendor("openai")

def my_function_2():
    with (
        verse.trace(session="user-123", scope="agent-a"),
        verse.span(input="processing_data", level="info"),
        verse.generation(model="gpt-4", vendor="openai"),
    ):
        pass

Attributes can be provided at initialization or added afterwards through each observation model's own API.

Decorators

Instrument functions with minimal code changes:

@observe_trace()
@observe_span() # or just @observe()
@observe_tool()
@observe_generation()
def my_function():
    pass
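
Under the hood, an observe-style decorator typically just wraps the function body in the corresponding context manager. A minimal sketch of the pattern (illustrative only, not the SDK's actual implementation; span and SPAN_LOG here are stand-ins):

```python
import functools
from contextlib import contextmanager

# Hypothetical stand-in for verse.span(); records entered/exited spans.
SPAN_LOG: list[str] = []

@contextmanager
def span(name: str):
    SPAN_LOG.append(f"enter:{name}")
    try:
        yield
    finally:
        SPAN_LOG.append(f"exit:{name}")

def observe(func):
    """Run each call of func inside a span named after the function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        with span(func.__name__):
            return func(*args, **kwargs)
    return wrapper

@observe
def my_function():
    return "done"

my_function()
# SPAN_LOG now holds ["enter:my_function", "exit:my_function"]
```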

Context Models

All contexts inherit these methods:

  • error(exception) - Record an error
  • score(score) - Add evaluation score
  • metadata(data) - Add metadata
  • set_attributes(**kwargs) - Set custom attributes
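
The chained calls in this document work because every setter returns the context itself. A minimal sketch of that fluent pattern (illustrative only; the real TraceContext has more methods and different internals):

```python
class TraceContext:
    """Illustrative fluent context: each setter returns self so calls chain."""

    def __init__(self) -> None:
        self.attributes: dict[str, object] = {}

    def session(self, session_id: str) -> "TraceContext":
        self.attributes["session"] = session_id
        return self

    def scope(self, scope: str) -> "TraceContext":
        self.attributes["scope"] = scope
        return self

    def input(self, data: object) -> "TraceContext":
        self.attributes["input"] = data
        return self

# One expression sets three attributes on the same context.
trace = TraceContext().session("user-123").scope("agent-a").input("Hello world")
```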

TraceContext

Top-level operation context for tracing complete workflows.

Methods:

  • input(data) - Set trace input
  • output(data) - Set trace output
  • session(session_id) - Set session identifier
  • scope(scope) - Set trace scope
  • user(user_id) - Set user identifier
  • tags(tags) - Add trace tags
  • update(**kwargs) - Update with additional attributes

Example:

with verse.trace() as trace:
    trace.session("user-123").scope("agent-a").input("Hello world")

SpanContext

Regular span context for sub-operations.

Methods:

  • input(data) - Set span input
  • output(data) - Set span output
  • level(level) - Set observation level
  • status_message(message) - Set status message
  • event(name, level="info", **attrs) - Add event with metadata

Example:

with verse.span() as span:
    span.input("processing").level("info").event("step_completed", level="debug")

GenerationContext

Specialized context for LLM operations (inherits from SpanContext).

Methods:

  • input(content) - Set generation prompt
  • output(content) - Set generation output
  • model(model_name) - Set model used
  • vendor(vendor) - Set model vendor
  • usage(usage) - Set token usage metrics

Example:

with verse.generation() as gen:
    gen.model("gpt-4").vendor("openai").input("Hello").output("Hi there!")

Integrations

LiteLLM

Auto-instruments LLM calls when vendor="litellm":

Supported:

  • Both completion and acompletion functions

Pydantic AI

Auto-instruments LLM calls when vendor="pydantic_ai":

Supported:

  • Agent.run() calls
  • Text-based prompts and instructions

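Auto-instrumentation of a vendor generally works by wrapping the vendor's call sites at runtime. A rough sketch of that monkey-patching pattern (illustrative only, not the SDK's actual mechanism; FakeVendorClient and CALLS are hypothetical stand-ins):

```python
import functools

CALLS: list[str] = []

class FakeVendorClient:
    """Stand-in for a vendor module such as litellm (hypothetical)."""
    @staticmethod
    def completion(prompt: str) -> str:
        return f"response to {prompt!r}"

def instrument(client_cls) -> None:
    """Replace completion with a wrapper that records each call."""
    original = client_cls.completion

    @functools.wraps(original)
    def traced(prompt: str) -> str:
        CALLS.append(prompt)      # record the generation for observability
        return original(prompt)   # delegate to the real vendor call
    client_cls.completion = staticmethod(traced)

instrument(FakeVendorClient)
FakeVendorClient.completion("hi")
# CALLS now holds ["hi"]; the caller's code is unchanged
```

This is why a single vendor="litellm" or vendor="pydantic_ai" argument at init time is enough: callers keep using the vendor API directly.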
Download files

Download the file for your platform.

Source Distribution

verse_sdk-0.1.9.tar.gz (44.4 kB)

Built Distribution

verse_sdk-0.1.9-py3-none-any.whl (31.8 kB)

File details

Details for the file verse_sdk-0.1.9.tar.gz.

File metadata

  • Download URL: verse_sdk-0.1.9.tar.gz
  • Size: 44.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.11

File hashes

Hashes for verse_sdk-0.1.9.tar.gz
Algorithm    Hash digest
SHA256       efc6bfee1bad512384199ed2a8d4a2b4709e8129730b6880d010c25432a22933
MD5          3baec6d9e2f2ddbbe2e80ba88ff97184
BLAKE2b-256  8b04cfbe0503beae0b65d7d64daa2b596553553e827ac5c000d856ad7670cf3b

File details

Details for the file verse_sdk-0.1.9-py3-none-any.whl.

File metadata

  • Download URL: verse_sdk-0.1.9-py3-none-any.whl
  • Size: 31.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.11

File hashes

Hashes for verse_sdk-0.1.9-py3-none-any.whl
Algorithm    Hash digest
SHA256       855984ce35cad2b0dcabc6cae2e5cf6158b061d6246965f05ad21b3b964afa65
MD5          4d4fa5425e1fde5df602098d93994fa2
BLAKE2b-256  96673b540fe4845a59d5c6eee116159d9eb719e7ea45ff7b65bfaff45597c562
