# Veris AI Python SDK
A Python package for Veris AI tools with simulation capabilities and FastAPI MCP (Model Context Protocol) integration.
## Quick Reference

- **Purpose:** Tool mocking, tracing, and FastAPI MCP (Model Context Protocol) integration for AI agent development
- **Core Components:** `tool_mock` • `api_client` • `observability` • `agents_wrapper` • `fastapi_mcp` • `jaeger_interface`
- **Deep Dive:** Module Architecture • Testing Guide • Usage Examples
- **Source of Truth:** Implementation details in the `src/veris_ai/` source code
## Installation

```bash
# Base package
uv add veris-ai

# With optional extras
uv add "veris-ai[dev,fastapi,observability,agents]"
```

**Installation profiles:**

- `dev`: Development tools (ruff, pytest, mypy)
- `fastapi`: FastAPI MCP integration
- `observability`: OpenTelemetry tracing
- `agents`: OpenAI agents integration
## Import Patterns

Semantic Tag: `import-patterns`

```python
# Core imports (base dependencies only)
from veris_ai import veris, JaegerClient

# Optional features (require extras)
from veris_ai import init_observability, instrument_fastapi_app  # requires the observability extra
from veris_ai import Runner, VerisConfig  # requires the agents extra
```

**Complete import strategies:** See `examples/README.md` for different import approaches, conditional features, and integration patterns.
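The conditional-import approach mentioned above can be sketched as follows (a minimal sketch; the `HAS_OBSERVABILITY` flag and `setup_tracing` helper are illustrative, not part of the SDK):

```python
# Guarded import: degrade gracefully when the observability extra is absent.
try:
    from veris_ai import init_observability, instrument_fastapi_app
    HAS_OBSERVABILITY = True
except ImportError:
    HAS_OBSERVABILITY = False

def setup_tracing() -> bool:
    """Initialize tracing only when the observability extra is available."""
    if HAS_OBSERVABILITY:
        init_observability()
        return True
    return False
```

This keeps the base package usable in environments where the optional extras are not installed.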
## Configuration

Semantic Tag: `environment-config`

| Variable | Purpose | Default |
|---|---|---|
| `VERIS_API_KEY` | API authentication key | None |
| `VERIS_MOCK_TIMEOUT` | Request timeout (seconds) | `90.0` |
| `ENV` | Set to `"simulation"` for mock mode | Production |

**Advanced configuration (rarely needed):**

- `VERIS_API_URL`: Override the default API endpoint (defaults to production)

**Configuration details:** See `src/veris_ai/api_client.py` for API configuration and `src/veris_ai/tool_mock.py` for environment handling logic.
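Putting the table together, a typical simulation-mode setup might export the following (the API key value is a placeholder):

```shell
# Enable simulation (mock) mode and configure the client
export ENV="simulation"
export VERIS_API_KEY="your-api-key"  # placeholder value
export VERIS_MOCK_TIMEOUT="90.0"     # the default, shown for clarity
```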
## SDK Observability Helpers

The SDK provides optional-safe observability helpers that standardize OpenTelemetry setup and W3C context propagation across services.

```python
from fastapi import FastAPI

from veris_ai import init_observability, instrument_fastapi_app

# Initialize tracing/export early (no-op if dependencies are absent)
init_observability()

app = FastAPI()

# Ensure inbound HTTP requests continue W3C traces
instrument_fastapi_app(app)
```
### Observability Environment

Set these environment variables to enable exporting traces via OTLP (Logfire) and to ensure consistent service naming:

| Variable | Example | Notes |
|---|---|---|
| `OTEL_SERVICE_NAME` | `simulation-server` | Should match `VERIS_SERVICE_NAME` used elsewhere to keep traces aligned |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | `https://logfire-api.pydantic.dev` | OTLP HTTP endpoint |
| `LOGFIRE_TOKEN` | `FILL_IN` | Logfire API token used by the exporter |
| `OTEL_EXPORTER_OTLP_HEADERS` | `'Authorization=FILL_IN'` | Include quotes to preserve the `=`; often `Authorization=Bearer <LOGFIRE_TOKEN>` |
Quick setup example:

```bash
export OTEL_SERVICE_NAME="simulation-server"
export OTEL_EXPORTER_OTLP_ENDPOINT="https://logfire-api.pydantic.dev"
export LOGFIRE_TOKEN="<your-token>"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=${LOGFIRE_TOKEN}"
```
Then initialize early in your process:

```python
from fastapi import FastAPI

from veris_ai import init_observability, instrument_fastapi_app

init_observability()
app = FastAPI()
instrument_fastapi_app(app)
```
What this enables:

- Sets the global W3C propagator (TraceContext + Baggage)
- Optionally instruments FastAPI, requests, httpx, and the MCP client if installed
- Includes request hooks that attach an outbound `traceparent` header on HTTP calls for continuity

End-to-end propagation with the simulator:

- The simulator injects W3C headers when connecting to your FastAPI MCP endpoints
- The SDK injects W3C headers on `/api/v2/tool_mock` and logging requests back to the simulator
- Result: customer agent spans and tool mocks appear under the same distributed trace
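For reference, the `traceparent` header that W3C propagation attaches follows the Trace Context format `version-traceid-spanid-flags`. A standalone sketch of that format (not the SDK's code, which delegates to the OpenTelemetry propagator):

```python
import re
import secrets

def make_traceparent() -> str:
    """Build a W3C traceparent value: 2-hex version, 32-hex trace id,
    16-hex span id, 2-hex flags ("01" = sampled)."""
    trace_id = secrets.token_hex(16)  # 32 hex chars
    span_id = secrets.token_hex(8)    # 16 hex chars
    return f"00-{trace_id}-{span_id}-01"

header = make_traceparent()
# Matches the W3C Trace Context grammar
assert re.fullmatch(r"00-[0-9a-f]{32}-[0-9a-f]{16}-01", header)
```

Every service that forwards this header unchanged (or creates a child span with the same trace id) ends up in the same distributed trace.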
## Function Mocking

Semantic Tag: `tool-mocking`

### Core Decorators

```python
from veris_ai import veris

# Mock mode: returns simulated responses when ENV=simulation
@veris.mock()
async def your_function(param1: str, param2: int) -> dict:
    """Function documentation for LLM context."""
    return {"result": "actual implementation"}

# Spy mode: executes the function but logs calls/responses
@veris.spy()
async def monitored_function(data: str) -> dict:
    return process_data(data)

# Stub mode: returns a fixed value in simulation
@veris.stub(return_value={"status": "success"})
async def get_data() -> dict:
    return await fetch_from_api()
```

**Behavior:** In simulation mode, decorators intercept calls and route them to mock endpoints. In production, functions execute normally.

**Implementation:** See `src/veris_ai/tool_mock.py` for decorator logic and API integration.
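Conceptually, the ENV-gated dispatch can be pictured with this simplified, self-contained sketch (illustrative only; the real decorator calls the Veris mock endpoint rather than returning a canned value):

```python
import asyncio
import functools
import os

def mock(simulated_response=None):
    """Toy ENV-gated mock decorator (not the SDK implementation)."""
    def decorator(func):
        @functools.wraps(func)
        async def wrapper(*args, **kwargs):
            if os.environ.get("ENV") == "simulation":
                # The real SDK would query the mock endpoint here.
                return simulated_response
            return await func(*args, **kwargs)
        return wrapper
    return decorator

@mock(simulated_response={"result": "mocked"})
async def get_data() -> dict:
    return {"result": "real"}

os.environ["ENV"] = "simulation"
print(asyncio.run(get_data()))  # {'result': 'mocked'}
```

Because the environment is checked at call time, the same code path serves both simulation and production without redeployment.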
## OpenAI Agents Integration

Semantic Tag: `openai-agents`

The SDK integrates with OpenAI's agents library through the `Runner` class, which extends OpenAI's `Runner` to intercept tool calls and route them through Veris's mocking infrastructure.

### Installation

```bash
# Install with agents support
uv add "veris-ai[agents]"
```

### Basic Usage

```python
from agents import Agent, function_tool

from veris_ai import veris, Runner, VerisConfig

# Define your tools
@function_tool
def calculator(x: int, y: int, operation: str = "add") -> int:
    """Performs arithmetic operations."""
    # ... implementation ...

# Create an agent with tools
agent = Agent(
    name="Assistant",
    model="gpt-4",
    tools=[calculator],
    instructions="You are a helpful assistant.",
)

# Use the Veris Runner instead of OpenAI's Runner
result = await Runner.run(agent, "Calculate 10 + 5")

# Or with configuration
config = VerisConfig(include_tools=["calculator"])
result = await Runner.run(agent, "Calculate 10 + 5", veris_config=config)
```
### Selective Tool Interception

Control which tools are intercepted using `VerisConfig`:

```python
from veris_ai import Runner, VerisConfig

# Only intercept specific tools
config = VerisConfig(include_tools=["calculator", "search_web"])
result = await Runner.run(agent, "Process this", veris_config=config)

# Or exclude specific tools from interception
config = VerisConfig(exclude_tools=["get_weather"])
result = await Runner.run(agent, "Check weather", veris_config=config)
```
### Advanced Tool Configuration

Fine-tune individual tool behavior using `ToolCallOptions`:

```python
from veris_ai import Runner, VerisConfig, ResponseExpectation, ToolCallOptions

# Configure specific tool behaviors
config = VerisConfig(
    tool_options={
        "calculator": ToolCallOptions(
            response_expectation=ResponseExpectation.REQUIRED,  # always expect a response
            cache_response=True,  # cache responses for identical calls
            mode="tool",  # use tool mode (default)
        ),
        "search_web": ToolCallOptions(
            response_expectation=ResponseExpectation.NONE,  # don't wait for a response
            cache_response=False,
            mode="spy",  # log calls but execute normally
        ),
    }
)
result = await Runner.run(agent, "Calculate and search", veris_config=config)
```

**`ToolCallOptions` parameters:**

- `response_expectation`: Controls response behavior
  - `AUTO` (default): Automatically determined based on context
  - `REQUIRED`: Always wait for a mock response
  - `NONE`: Don't wait for a response
- `cache_response`: Cache responses for identical tool calls
- `mode`: Tool execution mode
  - `"tool"` (default): Standard tool execution
  - `"function"`: Function mode
**Key features:**

- **Drop-in replacement:** Use `Runner` from `veris_ai` instead of OpenAI's `Runner`
- **Extends the OpenAI Runner:** Inherits all functionality while adding Veris capabilities
- **Automatic session management:** Integrates with Veris session IDs
- **Selective mocking:** Include or exclude specific tools from interception

**Implementation:** See `src/veris_ai/agents_wrapper.py` for the integration logic and `examples/openai_agents_example.py` for complete examples.
## FastAPI MCP Integration

Semantic Tag: `fastapi-mcp`

Expose FastAPI endpoints as MCP tools for AI agent consumption using HTTP transport.

```python
from fastapi import FastAPI

from veris_ai import veris

app = FastAPI()

# Enable MCP integration with HTTP transport
veris.set_fastapi_mcp(
    fastapi=app,
    name="My API Server",
    include_operations=["get_users", "create_user"],
    exclude_tags=["internal"],
)

# Mount the MCP server with HTTP transport (recommended)
veris.fastapi_mcp.mount_http()
```

**Key features:**

- **HTTP transport:** Uses the Streamable HTTP protocol for better session management
- **Automatic schema conversion:** FastAPI OpenAPI → MCP tool definitions
- **Session management:** Bearer token → session ID mapping
- **Filtering:** Include/exclude operations and tags
- **Authentication:** OAuth2 integration

**Transport protocol:** The SDK uses HTTP transport (via `mount_http()`), which implements the MCP Streamable HTTP specification, providing robust connection handling and fixing session-routing issues with concurrent connections.

**Configuration reference:** See the function signature in `src/veris_ai/tool_mock.py` for all `set_fastapi_mcp()` parameters.
## Utility Functions

Semantic Tag: `json-schema-utils`

```python
from typing import List

from veris_ai.utils import extract_json_schema

# Schema extraction from types
user_schema = extract_json_schema(User)       # Pydantic models
list_schema = extract_json_schema(List[str])  # generics
```

**Supported types:** Built-in types, generics (`List`, `Dict`, `Union`), Pydantic models, `TypedDict`, forward references.

**Implementation:** See `src/veris_ai/utils.py` for type conversion logic.
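For intuition about the output shape, JSON Schemas for simple generics follow standard conventions. The toy converter below (not the SDK's `extract_json_schema`) shows the expected structure for a few basic cases:

```python
from typing import List, get_args, get_origin

def toy_schema(tp) -> dict:
    """Toy JSON Schema extractor for a handful of basic types (illustrative)."""
    basics = {str: "string", int: "integer", float: "number", bool: "boolean"}
    if tp in basics:
        return {"type": basics[tp]}
    if get_origin(tp) is list:
        (item,) = get_args(tp)
        return {"type": "array", "items": toy_schema(item)}
    raise TypeError(f"unsupported type: {tp!r}")

print(toy_schema(List[str]))  # {'type': 'array', 'items': {'type': 'string'}}
```

The SDK's real implementation additionally handles `Dict`, `Union`, Pydantic models, `TypedDict`, and forward references.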
## Development

Semantic Tag: `development-setup`

**Requirements:** Python 3.11+, the `uv` package manager

```bash
# Install with dev dependencies
uv add "veris-ai[dev]"

# Quality checks
ruff check --fix .     # lint and format
pytest --cov=veris_ai  # test with coverage
```

**Testing & architecture:** See `tests/README.md` for test structure, fixtures, and coverage strategies, and `src/veris_ai/README.md` for module architecture and implementation flows.
## Module Architecture

Semantic Tag: `module-architecture`

**Core modules:** `tool_mock` (mocking), `api_client` (centralized API), `agents_wrapper` (OpenAI agents integration), `jaeger_interface` (trace queries), `utils` (schema conversion)

**Complete architecture:** See `src/veris_ai/README.md` for module overview, implementation flows, and configuration details.
## Jaeger Trace Interface

Semantic Tag: `jaeger-query-api`

```python
from veris_ai.jaeger_interface import JaegerClient

client = JaegerClient("http://localhost:16686")
traces = client.search(service="veris-agent", tags={"error": "true"})
```

**Complete guide:** See `src/veris_ai/jaeger_interface/README.md` for API reference, filtering strategies, and architecture details.
License: MIT License - see LICENSE file for details.