Official Python SDK for AITracer — tracing, verification, governance, cost intelligence, and integrations
AITracer Python SDK
This repository is a monorepo (github.com/no1rstack/aitracer-sdk): the official Python package (this document), the TypeScript SDK at packages/sdk (npm install @aitracer/sdk), and starter templates under starters/.
Official Python entrypoint for AITracer — tracing, verification, governance signals, cost intelligence, and first-party integrations (OpenAI Agents, OTLP/JSON, Guardrails).
Install from PyPI (package name aitracer-sdk, import as aitracer):
pip install aitracer-sdk
Current version: 0.3.0
Configure
Use a workspace API key (akt_…) and your app's base URL (same origin as the AITracer web app).
from aitracer import AITracerClient, configure
configure(api_key="akt_xxx", base_url="https://your-app.example.com")
# Or construct a client explicitly:
client = AITracerClient(api_key="akt_xxx", base_url="https://your-app.example.com")
The client applies retries and timeouts and sends an Authorization: Bearer header on every request.
Basic tracing
Native ingest maps to POST /api/traces.
from aitracer import trace, configure
configure(api_key="akt_xxx", base_url="https://your-app.example.com")
trace.record(
    workflow="customer-support-agent",
    model="gpt-4.1",
    input_data={"query": "Status?"},
    output_data={"answer": "Shipped."},
    metrics={"promptTokens": 50, "completionTokens": 20, "latencyMs": 400},
)
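Conceptually, trace.record assembles one JSON document and posts it to /api/traces. A minimal sketch of that assembly step (key names mirror the SDK kwargs above; the actual wire-level field names and casing may differ per deployment):

```python
import json

def build_trace_payload(workflow: str, model: str, *,
                        input_data=None, output_data=None, metrics=None) -> str:
    """Assemble a JSON body for POST /api/traces (illustrative sketch;
    key names mirror the SDK kwargs, not a documented wire format)."""
    doc = {"workflow": workflow, "model": model}
    for key, value in (("input_data", input_data),
                       ("output_data", output_data),
                       ("metrics", metrics)):
        if value is not None:  # omit unset optional fields
            doc[key] = value
    return json.dumps(doc)

body = build_trace_payload(
    "customer-support-agent", "gpt-4.1",
    input_data={"query": "Status?"},
    metrics={"promptTokens": 50, "completionTokens": 20},
)
```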
Async tracing
For FastAPI, async agents, and other event-loop code, install the [async] extra (which pulls in httpx):
pip install 'aitracer-sdk[async]'
import asyncio
from aitracer import AsyncAITracerClient, configure_async, trace
async def main():
    configure_async(api_key="akt_xxx", base_url="https://your-app.example.com")
    await trace.arecord(
        workflow="async-job",
        model="gpt-4.1",
        input_data={"q": "hello"},
        output_data={"a": "world"},
    )

asyncio.run(main())
AsyncAITracerClient uses async retries with exponential backoff on timeouts, connection errors, and 429 / 502 / 503 / 504.
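Exponential backoff of this kind is easy to picture; a standalone sketch with hypothetical base delay, cap, and attempt limit (the SDK's actual constants and jitter are not specified here):

```python
RETRYABLE_STATUSES = {429, 502, 503, 504}

def backoff_delay(attempt: int, base: float = 0.5, cap: float = 8.0) -> float:
    """Exponential backoff: base * 2**attempt, capped (hypothetical constants)."""
    return min(cap, base * (2 ** attempt))

def should_retry(status: int, attempt: int, max_attempts: int = 4) -> bool:
    """Retry only on the retryable statuses, up to max_attempts tries."""
    return attempt < max_attempts and status in RETRYABLE_STATUSES

delays = [backoff_delay(n) for n in range(5)]  # 0.5, 1.0, 2.0, 4.0, 8.0
```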
Batch ingestion
Sync (thread pool, preserves order):
from aitracer import trace, configure
configure(api_key="akt_xxx", base_url="https://your-app.example.com")
trace.batch_record(
    [
        {"workflow": "bulk-1", "model": "gpt-4o-mini", "input_data": {"i": 1}},
        {"workflow": "bulk-2", "model": "gpt-4o-mini", "input_data": {"i": 2}},
    ],
    max_workers=8,
)
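Order preservation with a thread pool usually comes from mapping rather than collecting results as they complete; a self-contained sketch of that pattern (not the SDK's internals, with a stand-in for the HTTP call):

```python
from concurrent.futures import ThreadPoolExecutor

def batch_send(items, send, max_workers=8):
    """Send items concurrently but return results in input order,
    as ThreadPoolExecutor.map guarantees (pattern sketch only)."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(send, items))

results = batch_send(
    [{"workflow": "bulk-1"}, {"workflow": "bulk-2"}],
    send=lambda item: item["workflow"],  # stand-in for the real HTTP call
)
```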
Async (bounded concurrency):
await trace.batch_arecord(
    [{"workflow": "a", "model": "gpt-4o"}, {"workflow": "b", "model": "gpt-4o"}],
    concurrency=8,
)
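Bounded concurrency on an event loop is conventionally done with a semaphore; a self-contained sketch of the pattern (not the SDK's actual code, with a stand-in for the HTTP call):

```python
import asyncio

async def abatch_send(items, send, concurrency=8):
    """Run send() over items with at most `concurrency` calls in flight,
    preserving input order via asyncio.gather (pattern sketch only)."""
    sem = asyncio.Semaphore(concurrency)

    async def guarded(item):
        async with sem:
            return await send(item)

    return await asyncio.gather(*(guarded(i) for i in items))

async def fake_send(item):  # stand-in for the real HTTP call
    await asyncio.sleep(0)
    return item["workflow"]

results = asyncio.run(
    abatch_send([{"workflow": "a"}, {"workflow": "b"}], fake_send, concurrency=2)
)
```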
CLI
After install, the aitracer command is available:
aitracer init # print starter AITRACER_* env vars
export AITRACER_API_KEY=akt_...
export AITRACER_BASE_URL=https://your-app.example.com
aitracer test-ingest # POST a minimal trace
aitracer verify <trace_id> # best-effort lookup (depends on auth mode)
HTTP errors & observability
AITracerHTTPError includes request_id, correlation_id, url, attempts_made, and a response_body excerpt. Use exc.diagnostics() for structured logging.
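A minimal picture of what diagnostics() yields for structured logging, built from the attributes listed above (the values and the dataclass shape are illustrative, not the SDK's actual class):

```python
import dataclasses

@dataclasses.dataclass
class HTTPErrorInfo:
    """Mirrors the attributes AITracerHTTPError exposes (sketch only)."""
    request_id: str
    correlation_id: str
    url: str
    attempts_made: int
    response_body: str

    def diagnostics(self) -> dict:
        """Flatten the fields into a dict suitable for a structured logger."""
        return dataclasses.asdict(self)

info = HTTPErrorInfo(
    request_id="req-123",
    correlation_id="corr-456",
    url="https://your-app.example.com/api/traces",
    attempts_made=3,
    response_body='{"error": "rate limited"}',
)
```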
Verification
Many deployments embed verification records on the trace ingest response (there is often no standalone public verification URL).
from aitracer import verify
records = verify.records_from_response(ingest_response)
digest = verify.hash("your-trace-id")
Governance (Guardrails / policy)
Normalized guardrail and policy outcomes → POST /api/integrations/openai-guardrails.
from aitracer import governance, configure
configure(api_key="akt_xxx", base_url="https://your-app.example.com")
governance.report_policy_event(
guardrail_name="pii",
trigger_type="pii_detected",
trigger_source="user_input",
action_taken="blocked",
trace_id="optional-public-trace-slug",
)
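The event body that report_policy_event posts can be sketched as follows (field names are taken from the call above; the wire-level casing is an assumption, and trace_id is omitted when unset):

```python
import json

def build_policy_event(guardrail_name, trigger_type, trigger_source,
                       action_taken, trace_id=None) -> str:
    """Shape of the body likely sent to /api/integrations/openai-guardrails
    (sketch; field names follow the SDK kwargs, not a documented schema)."""
    event = {
        "guardrail_name": guardrail_name,
        "trigger_type": trigger_type,
        "trigger_source": trigger_source,
        "action_taken": action_taken,
    }
    if trace_id is not None:  # trace linkage is optional
        event["trace_id"] = trace_id
    return json.dumps(event)

body = build_policy_event("pii", "pii_detected", "user_input", "blocked")
```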
Cost intelligence
There is often no standalone cost HTTP API; token and spend fields are carried on trace / execution payloads. Helpers aggregate what you already ingested or listed:
from aitracer import cost
summary = cost.from_ingest_response(ingest_response)
rollup = cost.aggregate(list_of_trace_dicts)
spend_by_model = cost.get_model_spend(list_of_trace_dicts)
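The aggregation these helpers perform can be sketched as a pure-Python rollup over trace dicts (metric field names follow the ingest example earlier; cost.aggregate's actual output shape may differ):

```python
def aggregate_costs(traces):
    """Roll up token counts carried on trace metrics and bucket totals
    per model (sketch of what cost.aggregate / get_model_spend compute)."""
    total = {"promptTokens": 0, "completionTokens": 0}
    by_model = {}
    for t in traces:
        m = t.get("metrics", {})
        total["promptTokens"] += m.get("promptTokens", 0)
        total["completionTokens"] += m.get("completionTokens", 0)
        tokens = m.get("promptTokens", 0) + m.get("completionTokens", 0)
        model = t.get("model", "unknown")
        by_model[model] = by_model.get(model, 0) + tokens
    return total, by_model

total, by_model = aggregate_costs([
    {"model": "gpt-4.1", "metrics": {"promptTokens": 50, "completionTokens": 20}},
    {"model": "gpt-4o-mini", "metrics": {"promptTokens": 10, "completionTokens": 5}},
])
```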
OpenAI Agents integration
POST /api/integrations/openai-agents
from aitracer.integrations.openai_agents import register_tracing, export_spans
cfg = register_tracing(
base_url="https://your-app.example.com",
api_key="akt_xxx",
)
export_spans({"traceId": "...", "spans": [...]}, config=cfg)
OTLP integration
POST /api/integrations/otlp/v1/traces (JSON encoding).
from aitracer.integrations.otlp import export_resource_spans, collector_exporter_yaml_snippet
from aitracer import AITracerClient
client = AITracerClient(api_key="akt_xxx", base_url="https://your-app.example.com")
export_resource_spans({"resourceSpans": [...]}, client=client)
print(collector_exporter_yaml_snippet(base_url="https://your-app.example.com", api_key="akt_xxx"))
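The JSON body expected at that endpoint follows the standard OTLP/JSON resourceSpans layout; a minimal hand-built example (service name, tracer name, IDs, and timestamps are illustrative):

```python
# Minimal OTLP/JSON payload following the standard resourceSpans shape.
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [
            {"key": "service.name", "value": {"stringValue": "my-agent"}},
        ]},
        "scopeSpans": [{
            "scope": {"name": "my-tracer"},
            "spans": [{
                "traceId": "5b8aa5a2d2c872e8321cf37308d69df2",  # 16-byte hex
                "spanId": "051581bf3cb55c13",                   # 8-byte hex
                "name": "llm.call",
                "kind": 1,  # SPAN_KIND_INTERNAL
                "startTimeUnixNano": "1700000000000000000",
                "endTimeUnixNano": "1700000000400000000",
            }],
        }],
    }],
}
```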
Documentation
- Product docs: https://aitracer.app/docs
- Developer reference (OpenAPI): https://aitracer.app/developers (or /developers on your deployment)
Examples
Runnable samples live in examples/ (set AITRACER_API_KEY and optional AITRACER_BASE_URL).
Local smoke install
./scripts/smoke-install-test.sh local
Roadmap (toward “production hardening”)
Shipped in 0.3.0: async client, batch ingest, richer HTTP errors, CLI stubs, trace.astream / webhooks placeholders, contrib/ for framework adapters.
Still the main leverage points for enterprise adoption:
| Area | Status |
|---|---|
| Async (AsyncAITracerClient, trace.arecord, batch_arecord) | Implemented ([async] extra) |
| Batch (trace.batch_record) | Implemented |
| Streaming (trace.astream) | Not implemented; needs platform ingest contract |
| Framework wrappers (LangChain, LlamaIndex, CrewAI, FastAPI middleware, …) | contrib/ placeholder only |
| CLI | init, test-ingest, verify |
| Error observability | Request/correlation IDs, URL, attempts, diagnostics() |
| Pydantic models | Optional future extra (validation / IDE ergonomics) |
| Webhooks | handle_verification_webhook placeholder |
| MCP / enterprise policy | Future layers on top of stable client / trace APIs |
License
MIT — see repository for details.
Download files
File details
Details for the file aitracer_sdk-0.3.0.tar.gz.
File metadata
- Download URL: aitracer_sdk-0.3.0.tar.gz
- Upload date:
- Size: 19.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e059a344e1999791418d456ff6b6c81e6725d03dc434471e01288915e0c4576d |
| MD5 | 5745f2526bd5ce8ba64b36563cc9d198 |
| BLAKE2b-256 | 006bf8fd2759ca323f5e174994ef759f30b2d705c07e191a9d61a8ecf2f1bf7f |
File details
Details for the file aitracer_sdk-0.3.0-py3-none-any.whl.
File metadata
- Download URL: aitracer_sdk-0.3.0-py3-none-any.whl
- Upload date:
- Size: 23.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ddc234ecd39c1cfb185135d1cabbb69ffd7cc6ade26cc7508817b9492b50cd6f |
| MD5 | f7c8cd012e38ff13f79bd33943eb09be |
| BLAKE2b-256 | 61bcb0b10515bdf3d015f03bb8664ded1b589f3d6dc99e527316e04ab49bee37 |