
OpenInference instrumentation for Codex CLI/SDK telemetry


openinference-instrumentation-codex

OpenInference instrumentation for Codex event streams, with a lightweight OTel proxy for process-based tracing.

Repo Boundary

| Repo | Role | Contract with this repo |
|---|---|---|
| openinference-instrumentation-codex | Instrumentation library and OTel proxy | Owns event adapters, span building, and the `codex-otel-proxy` CLI |
| External callers | Orchestration layer | Run Codex and pass event streams into the proxy |
| Trace backends | OTLP collectors and sinks | Receive OTLP traces emitted by this package |

How It Works

```mermaid
flowchart LR
    source["Codex CLI JSONL or app-server events"] --> adapter["event adapter"]
    adapter --> builder["CodexTraceBuilder"]
    builder --> spans["codex.session tree"]
    spans --> exporter["OTLP exporter"]
```

See docs/DIAGRAM.md for the boundary view.

Repo Map

| Path | Purpose |
|---|---|
| `src/openinference/instrumentation/codex/` | Library source and OTel proxy |
| `tests/` | Unit, integration, and e2e coverage |
| `examples/` | Minimal usage examples |
| `scripts/` | Helper scripts |
| `docs/` | Human docs |

Install

| Use case | Command |
|---|---|
| Base package | `pip install openinference-instrumentation-codex` |
| Proxy with OTLP exporter support | `pip install "openinference-instrumentation-codex[otlp]"` |
| Local development | `uv sync --extra test` |

Public Surfaces

| Surface | Purpose |
|---|---|
| `codex-otel-proxy --transport cli-jsonl` | Observe `codex exec --json` JSONL and build spans |
| `codex-otel-proxy --transport app-server` | Observe app-server JSON notifications and build spans |
| `CodexCliJsonlAdapter` | Normalize CLI JSONL events |
| `CodexAppServerEventAdapter` | Normalize app-server notifications |
| `CodexTraceBuilder` | Build the `codex.session` tree |
| `CodexInstrumentor` | Fallback SDK instrumentation |
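As a rough illustration of what the adapters do conceptually, the sketch below normalizes one `codex exec --json` JSONL line into an (event type, payload) pair that a trace builder could consume. The function name and the field names (`type`, `turn_id`, `tokens`) are hypothetical, not the library's real schema.

```python
import json

def normalize_cli_jsonl_line(line: str) -> tuple[str, dict]:
    """Hypothetical sketch: split one JSONL event into its type and payload.

    Field names here are illustrative assumptions, not the real Codex schema.
    """
    event = json.loads(line)
    event_type = event.get("type", "unknown")
    # Everything except the discriminator becomes the payload.
    payload = {k: v for k, v in event.items() if k != "type"}
    return event_type, payload

raw = '{"type": "turn.completed", "turn_id": "t1", "tokens": 42}'
kind, data = normalize_cli_jsonl_line(raw)
```

The real adapters presumably carry more context (timestamps, parent identifiers) so the builder can place each event in the span tree.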

Config

| Env var | Default | Meaning |
|---|---|---|
| `OPENINFERENCE_CODEX_ENABLED` | `true` | Enable instrumentation |
| `OPENINFERENCE_CODEX_CAPTURE_INPUTS` | `false` | Capture prompt input content |
| `OPENINFERENCE_CODEX_CAPTURE_OUTPUTS` | `false` | Capture model output content |
| `OPENINFERENCE_CODEX_CAPTURE_TOOL_OUTPUTS` | `false` | Capture tool result content |
| `OPENINFERENCE_CODEX_CAPTURE_RAW_RESPONSE_EVENTS` | `false` | Capture raw app-server response events |
| `OPENINFERENCE_CODEX_REDACT_INPUTS` | `true` | Redact sensitive patterns |
| `OPENINFERENCE_CODEX_MAX_ATTRIBUTE_LENGTH` | `4096` | Truncate serialized attributes |
| `OPENINFERENCE_CODEX_PRESERVE_ORIGINAL_ATTRIBUTES` | `true` | Keep source attributes under `codex.original.*` |
| `OPENINFERENCE_CODEX_PROVIDER` | unset | Map the observed provider onto `llm.provider` |
| `OPENINFERENCE_CODEX_DEBUG` | `false` | Enable debug behavior in instrumentation config |
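To make the truncation setting concrete, here is a minimal sketch of the behavior implied by `OPENINFERENCE_CODEX_MAX_ATTRIBUTE_LENGTH`: serialized attribute values longer than the limit are cut off. The helper name is hypothetical; the package's actual implementation may differ (e.g. it may append a truncation marker).

```python
import os

def truncate_attribute(value: str) -> str:
    """Hypothetical helper: enforce the configured attribute length limit."""
    limit = int(os.environ.get("OPENINFERENCE_CODEX_MAX_ATTRIBUTE_LENGTH", "4096"))
    return value if len(value) <= limit else value[:limit]

short = truncate_attribute("hello")        # under the limit, unchanged
long = truncate_attribute("x" * 10_000)    # clipped to the default 4096 chars
```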

The proxy keeps trace export opt-in: set `OTEL_TRACES_EXPORTER=otlp` plus the standard OTLP endpoint environment variables when you want exported traces. Each observed Codex stream produces one `codex.session` root span; `codex.turn` and item/tool spans hang below it, and callers can attach job identity with repeated `--root-attribute KEY=VALUE` flags.
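Putting the opt-in pieces together, a typical setup might look like the fragment below. The endpoint value is an assumption (a local OTLP collector); the proxy invocation is shown in comments for illustration only.

```shell
# Opt in to trace export; endpoint assumed to be a local OTLP collector.
export OTEL_TRACES_EXPORTER=otlp
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317

# Then stream Codex JSONL through the proxy, attaching job identity
# to the codex.session root span (illustrative command):
#   codex exec --json "task" | codex-otel-proxy --transport cli-jsonl \
#       --root-attribute job.id=build-123 --root-attribute team=ml
```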

Development

| Command | Purpose |
|---|---|
| `uv run ruff check .` | Lint the repo |
| `uv run pytest` | Run tests |
| `docker compose -f docker-compose.phoenix.yml up --build --abort-on-container-exit --exit-code-from codex-live-smoke` | Run the Phoenix smoke test |

Caller workspaces should treat this repo as a black box and consume released package versions when they need a new proxy revision.

Releases

| Item | Value |
|---|---|
| Strategy | release-please simple |
| Source of truth | `version.txt` |
| Extra release files | `pyproject.toml`, `_version.py` |
| Merge style | Squash merge with Conventional Commit PR titles |

Project Docs

| Doc | Purpose |
|---|---|
| docs/DIAGRAM.md | System diagram |
| docs/CONTRIBUTING.md | Contribution flow |
| docs/SECURITY.md | Security scope and reporting |
| docs/CODE_OF_CONDUCT.md | Collaboration standards |
| docs/CONTACT.md | Private contact path |
| AGENTS.md | Repo-specific agent guidance |
