# signalvault

AI audit logs and guardrails for your OpenAI and Anthropic Python applications.
## Installation

```shell
# OpenAI only
pip install signalvault openai

# Anthropic only (quote the extra so zsh doesn't expand the brackets)
pip install "signalvault[anthropic]"

# Both providers
pip install "signalvault[anthropic]" openai
```
## Quick Start — OpenAI (sync)

```python
import os

from signalvault import SignalVaultClient

client = SignalVaultClient(
    api_key="sk_live_your_signalvault_key",
    openai_api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://api.signalvault.io",
    environment="production",
)

# Use exactly like the OpenAI SDK
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```
## Quick Start — OpenAI (async, FastAPI / async Django)

```python
import asyncio
import os

from signalvault import AsyncSignalVaultClient

client = AsyncSignalVaultClient(
    api_key="sk_live_your_signalvault_key",
    openai_api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://api.signalvault.io",
)

async def main() -> None:
    response = await client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```

Inside a FastAPI or async Django handler you are already in an event loop, so call `await client.chat.completions.create(...)` directly instead of using `asyncio.run`.
## Quick Start — Anthropic

```python
import os

from signalvault import AnthropicSignalVaultClient

client = AnthropicSignalVaultClient(
    api_key="sk_live_your_signalvault_key",
    anthropic_api_key=os.environ["ANTHROPIC_API_KEY"],
    base_url="https://api.signalvault.io",
)

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=1024,
)
print(response.content[0].text)
```
## Streaming

Streaming is fully supported for all clients and providers:

```python
# OpenAI streaming (sync)
stream = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Write a poem"}],
    stream=True,
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)
```

```python
# OpenAI streaming (async)
stream = await async_client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Write a poem"}],
    stream=True,
)
async for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)
```

```python
# Anthropic streaming
stream = anthropic_client.messages.create(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Write a poem"}],
    max_tokens=1024,
    stream=True,
)
for event in stream:
    if event.type == "content_block_delta":
        print(event.delta.text or "", end="", flush=True)
```
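If you also need the assembled completion after streaming it, the deltas can be accumulated as they arrive. A minimal sketch with mocked delta strings rather than a live stream (`collect` is illustrative, not part of the SDK):

```python
def collect(deltas):
    """Join streamed text deltas into the full completion.

    Deltas may be None (e.g. the final chunk of an OpenAI stream),
    so empty values are skipped.
    """
    return "".join(d or "" for d in deltas)

# Mocked delta strings; in real use, append chunk.choices[0].delta.content
# (OpenAI) or event.delta.text (Anthropic) inside the streaming loop.
full_text = collect(["Roses ", "are ", None, "red"])
print(full_text)  # Roses are red
```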
## Metadata

Attach contextual metadata to every event for user attribution, analytics, and audit trails:

```python
# Set defaults at the client level
client = SignalVaultClient(
    api_key="sk_live_...",
    openai_api_key=os.environ["OPENAI_API_KEY"],
    metadata={"workspace_id": "ws_abc", "env": "production"},
)

# Override per call
response = client.chat.completions.create(
    model="gpt-4",
    messages=[...],
    metadata={"user_id": "u_123", "feature": "support-chat"},
)
```
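The examples above imply that per-call metadata layers over the client-level defaults. A sketch of the assumed merge semantics (shallow merge, per-call keys win) — this is an assumption, not documented SDK behavior, so verify against the events in your SignalVault dashboard:

```python
# Assumed semantics: per-call metadata shallow-merges over client defaults.
defaults = {"workspace_id": "ws_abc", "env": "production"}
per_call = {"user_id": "u_123", "env": "staging"}

# Dict unpacking: later keys win, so per-call "env" overrides the default.
merged = {**defaults, **per_call}
print(merged)
# {'workspace_id': 'ws_abc', 'env': 'staging', 'user_id': 'u_123'}
```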
## Timeout Configuration

The pre-flight guardrail check is in your request's critical path. SignalVault uses a short timeout and fails open — your request always goes through even if the SignalVault API is unreachable:

```python
client = SignalVaultClient(
    api_key="sk_live_...",
    openai_api_key=os.environ["OPENAI_API_KEY"],
    preflight_timeout=2.0,  # seconds — pre-flight check (fails open). Default: 2.0
    timeout=30.0,           # seconds — background/post-flight calls. Default: 30.0
)
```
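The fail-open behavior described above boils down to one rule: only an explicit "block" verdict stops a request; any error or timeout lets it through. A sketch of the pattern, not the SDK's internals (`check` stands in for the hypothetical pre-flight API call):

```python
def preflight_allows(check, timeout=2.0):
    """Return True unless the guardrail check explicitly blocks.

    `check` is a stand-in for the pre-flight API call. If it raises
    (timeout, network error, anything), we fail open and allow.
    """
    try:
        verdict = check(timeout=timeout)
    except Exception:
        return True  # fail open: guardrail service unreachable
    return verdict != "block"

def unreachable_api(timeout):
    raise TimeoutError("SignalVault API unreachable")

print(preflight_allows(unreachable_api))          # True — request proceeds
print(preflight_allows(lambda timeout: "block"))  # False — explicit block
```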
## Mirror Mode

In mirror mode, requests go directly to the AI provider first and SignalVault audits them asynchronously — no latency added, never blocks:

```python
client = SignalVaultClient(
    api_key="sk_live_...",
    openai_api_key=os.environ["OPENAI_API_KEY"],
    mirror_mode=True,
)
```
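The mirror-mode flow reduces to "provider first, audit in the background." An illustrative sketch of that pattern, not the SDK's internals (`send` stands in for a hypothetical audit-ingestion call):

```python
import threading

def audit_async(record, send):
    """Fire-and-forget: ship `record` via `send` on a background thread."""
    t = threading.Thread(target=send, args=(record,), daemon=True)
    t.start()
    return t

def call_with_mirror(provider_call, prompt, send):
    # Provider call completes first — no audit latency on the hot path.
    response = provider_call(prompt)
    audit_thread = audit_async({"prompt": prompt, "response": response}, send)
    return response, audit_thread

records = []
out, t = call_with_mirror(lambda p: p.upper(), "hello", records.append)
t.join()  # demo only: wait so we can inspect the audit record
print(out)  # HELLO
```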
## Features

- Automatic Logging — Every request and response is recorded
- Pre-flight Guardrails — Block or redact requests before they reach the AI provider
- PII Detection — Detect emails, phone numbers, and SSNs
- Secret Detection — Block API keys and tokens
- Token Limits — Enforce cost controls
- Model Allowlists — Restrict which models can be used
- Streaming — Full streaming support for OpenAI and Anthropic
- Async Support — `AsyncSignalVaultClient` and `AsyncAnthropicSignalVaultClient` for async codebases
- Mirror Mode — Observe without blocking
- Metadata — Tag every event with `user_id`, `feature`, `workspace_id`, etc.
- Multi-provider — OpenAI and Anthropic/Claude support
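As a rough illustration of what the PII guardrail looks for, here is a toy detector for the listed categories. The patterns are demonstration assumptions only — the SDK's real detection is performed by the SignalVault service and is more robust than these regexes:

```python
import re

# Toy patterns for the three PII categories the guardrails list.
# These are illustrative assumptions, not SignalVault's actual rules.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),   # US-style numbers
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def detect_pii(text):
    """Return the sorted category names found in `text`."""
    return sorted(name for name, pat in PII_PATTERNS.items() if pat.search(text))

print(detect_pii("Contact jane@example.com or 555-867-5309"))  # ['email', 'phone']
```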
## License

MIT
## File details

### signalvault-0.3.0.tar.gz

- Size: 12.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `c7a2c41578718553dd9ca636d882e675bd4646b95314d3e805f327b00f92fa23` |
| MD5 | `fc2e12ff88fcdf8d1217eea454a7b811` |
| BLAKE2b-256 | `b529c3a90afa2fe177845907fb3d3cfe8eb2c124aa11af7032dfb76e7928007d` |
### signalvault-0.3.0-py3-none-any.whl

- Size: 8.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `38e16fb99fa8b39d58e963a6f2456f7eb96f4f656bf0a91bb720aed71e4d433b` |
| MD5 | `ec7d193240e4526f0852d73088f6fa1d` |
| BLAKE2b-256 | `edefa6a00cbd28f41e8af9a9f41e7a42e4bf1cc2d01e5fbb0d6875366ffcdeb1` |