
OpenInference instrumentation for Codex CLI/SDK telemetry


openinference-instrumentation-codex

OpenInference instrumentation for Codex execution, centered on Codex event streams rather than shallow method wrappers.

What it does

The primary path is event-native:

  • CodexTraceBuilder creates one logical codex.session root span, turn spans, and item spans.
  • CodexCliJsonlAdapter normalizes codex exec --json JSONL events.
  • CodexAppServerEventAdapter normalizes richer app-server notifications.

Fallback-only paths remain available when richer events are not exposed:

  • CodexInstrumentor wraps opaque SDK calls such as codex.Client.run, create_task, and instance-bound responses.create.
  • CodexCliInstrumentor traces outer subprocess.run(...) calls for Codex commands.

Fallback spans do not fabricate a session tree or fake LLM/tool boundaries.
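To make the event-native idea concrete, here is a minimal, stdlib-only sketch of folding a stream of JSONL events into a session → turn → item tree. The event names and shapes are hypothetical, not the actual Codex JSONL schema, and this is not the library's implementation:

```python
import json

def build_tree(jsonl_lines):
    # One logical session root; turns accumulate items as events arrive.
    tree = {"span": "codex.session", "turns": []}
    for line in jsonl_lines:
        event = json.loads(line)
        kind = event.get("type")
        if kind == "turn.started":
            tree["turns"].append({"span": "codex.turn", "items": []})
        elif kind == "item.completed":
            tree["turns"][-1]["items"].append(event.get("item_type"))
    return tree

events = [
    '{"type": "turn.started"}',
    '{"type": "item.completed", "item_type": "command_execution"}',
]
print(build_tree(events))
```

A method wrapper around one opaque SDK call never sees this internal structure, which is why the fallback instrumentors cannot reconstruct it.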

Install

pip install openinference-instrumentation-codex

For the public CLI runner plus OTLP export support:

pip install "openinference-instrumentation-codex[otlp]"
openinference-codex exec -- "Summarize this repository"

For development/testing:

uv sync --extra test

Quickstart (CLI JSONL)

For process-based integrations, prefer the public runner:

openinference-codex exec -- "Summarize this repository"

It launches codex exec --json, builds the event-native session tree, honors standard OTel trace exporter/resource environment variables, and propagates TRACEPARENT / TRACESTATE when present.

import subprocess

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

from openinference.instrumentation.codex import CodexCliJsonlAdapter, CodexTraceBuilder

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)
trace.set_tracer_provider(provider)

builder = CodexTraceBuilder(tracer_provider=provider)
adapter = CodexCliJsonlAdapter(builder)

proc = subprocess.Popen(
    ["codex", "exec", "--json", "Summarize this repository"],
    stdout=subprocess.PIPE,
    text=True,
)
assert proc.stdout is not None
for line in proc.stdout:
    adapter.observe_line(line)
proc.wait()
builder.finish()

Quickstart (app-server)

from openinference.instrumentation.codex import CodexAppServerEventAdapter, CodexTraceBuilder

builder = CodexTraceBuilder()
adapter = CodexAppServerEventAdapter(builder)

# Feed JSON-RPC notifications from Codex app-server as they arrive.
adapter.observe_notification(notification)
builder.finish()
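If you are wiring the adapter yourself, note that in JSON-RPC 2.0 a notification is a message without an "id" field. A small filter like the sketch below (the method name is made up, and the adapter wiring is elided) can separate notifications from request/response traffic before calling observe_notification:

```python
import json

def is_notification(raw_line: str) -> bool:
    # JSON-RPC 2.0: notifications carry no "id" and expect no response.
    msg = json.loads(raw_line)
    return msg.get("jsonrpc") == "2.0" and "id" not in msg

print(is_notification('{"jsonrpc": "2.0", "method": "sessionConfigured", "params": {}}'))
print(is_notification('{"jsonrpc": "2.0", "id": 1, "result": {}}'))
```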

Quickstart (SDK fallback)

from openinference.instrumentation.codex import CodexInstrumentor

# Use only when native SDK events/tracing are not available.
CodexInstrumentor().instrument()

Span model

Surface | Root span | Child spans
Event-native CLI/app-server | codex.session (AGENT) | codex.turn (CHAIN), tool/item spans
SDK fallback | (none synthesized) | opaque codex.Client.* fallback spans
CLI fallback | (none synthesized) | outer codex.cli.run process span

Current item mappings include:

  • command_execution / commandExecution -> codex.tool.command_execution
  • file_change / fileChange -> codex.file_change
  • mcp_tool_call / mcpToolCall -> codex.tool.mcp
  • dynamicToolCall -> codex.tool.dynamic
  • collabToolCall -> codex.agent.subagent
  • web_search / webSearch -> codex.tool.web_search
  • imageView -> codex.tool.image_view
  • contextCompaction -> codex.context_compaction
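For reference, the mappings above can be restated as a plain lookup table; this is illustrative only, and the library's internal representation may differ:

```python
# Both snake_case and camelCase source names resolve to the same span name.
ITEM_SPAN_NAMES = {
    "command_execution": "codex.tool.command_execution",
    "commandExecution": "codex.tool.command_execution",
    "file_change": "codex.file_change",
    "fileChange": "codex.file_change",
    "mcp_tool_call": "codex.tool.mcp",
    "mcpToolCall": "codex.tool.mcp",
    "dynamicToolCall": "codex.tool.dynamic",
    "collabToolCall": "codex.agent.subagent",
    "web_search": "codex.tool.web_search",
    "webSearch": "codex.tool.web_search",
    "imageView": "codex.tool.image_view",
    "contextCompaction": "codex.context_compaction",
}

print(ITEM_SPAN_NAMES["mcpToolCall"])  # codex.tool.mcp
```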

Mapped attributes use OpenInference/OpenTelemetry conventions where available, including:

  • openinference.span.kind
  • session.id
  • graph.node.id, graph.node.parent_id
  • llm.model_name, llm.provider
  • llm.token_count.prompt, llm.token_count.completion, llm.token_count.total
  • tool.id, tool.name, tool.parameters

Optional content fields:

  • input.value (when capture_inputs=True)
  • output.value (when capture_outputs=True)
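A minimal sketch of how that gating behaves, assuming a hypothetical helper whose flag names mirror capture_inputs / capture_outputs above (this is not the library's code):

```python
def build_attributes(input_text, output_text, capture_inputs=False, capture_outputs=False):
    # Content fields are only attached when explicitly opted in.
    attrs = {"openinference.span.kind": "AGENT"}
    if capture_inputs:
        attrs["input.value"] = input_text
    if capture_outputs:
        attrs["output.value"] = output_text
    return attrs

print(build_attributes("prompt", "answer"))  # content withheld by default
print(build_attributes("prompt", "answer", capture_inputs=True))
```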

Config

Safe defaults are privacy-first.

Env var | Default | Meaning
OPENINFERENCE_CODEX_ENABLED | true | Enable/disable instrumentation
OPENINFERENCE_CODEX_CAPTURE_INPUTS | false | Capture input content
OPENINFERENCE_CODEX_CAPTURE_OUTPUTS | false | Capture output content
OPENINFERENCE_CODEX_CAPTURE_TOOL_OUTPUTS | false | Capture tool result content
OPENINFERENCE_CODEX_REDACT_INPUTS | true | Redact sensitive patterns
OPENINFERENCE_CODEX_MAX_ATTRIBUTE_LENGTH | 4096 | Truncate serialized attributes to this length
OPENINFERENCE_CODEX_PRESERVE_ORIGINAL_ATTRIBUTES | true | Preserve source attributes as codex.original.*

[!NOTE] When the capture flags are off, the corresponding codex.original.input, codex.original.output, codex.original.invocation_parameters, and tool-output original attributes are suppressed as well.
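The table above can be read as, roughly, the following stdlib-only sketch; the parsing helper is an assumption, not the library's actual config loader:

```python
import os

def _flag(name: str, default: bool) -> bool:
    # Treat "1"/"true"/"yes" (case-insensitive) as enabled.
    return os.environ.get(name, str(default).lower()).strip().lower() in ("1", "true", "yes")

config = {
    "enabled": _flag("OPENINFERENCE_CODEX_ENABLED", True),
    "capture_inputs": _flag("OPENINFERENCE_CODEX_CAPTURE_INPUTS", False),
    "capture_outputs": _flag("OPENINFERENCE_CODEX_CAPTURE_OUTPUTS", False),
    "capture_tool_outputs": _flag("OPENINFERENCE_CODEX_CAPTURE_TOOL_OUTPUTS", False),
    "redact_inputs": _flag("OPENINFERENCE_CODEX_REDACT_INPUTS", True),
    "max_attribute_length": int(os.environ.get("OPENINFERENCE_CODEX_MAX_ATTRIBUTE_LENGTH", "4096")),
}
print(config)
```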

Phoenix flow

flowchart LR
  A[Codex CLI/app-server events] --> B[openinference-instrumentation-codex]
  B --> C[OTEL SDK + OTLP Exporter]
  C --> D[Phoenix /v1/traces]

For Phoenix project routing, set openinference.project.name as a resource attribute on the tracer provider. See examples/phoenix_example.py.

When the event-native adapter is the authoritative trace source for a Codex run, avoid also exporting an unrelated native Codex root trace unless you have verified parent-context behavior. codexops disables the nested Codex trace exporter and propagates TRACEPARENT into child Codex processes so the exported tree stays singular and inspectable.
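A sketch of handing trace context to a child Codex process via TRACEPARENT, using the W3C trace-context wire format (version-traceid-spanid-flags). The ids below are the W3C documentation examples; in practice they come from the active OTel span context:

```python
import os
import re

def child_env(trace_id: str, span_id: str, sampled: bool = True) -> dict:
    # W3C traceparent: 2-hex version, 32-hex trace id, 16-hex span id, 2-hex flags.
    traceparent = f"00-{trace_id}-{span_id}-{'01' if sampled else '00'}"
    assert re.fullmatch(r"00-[0-9a-f]{32}-[0-9a-f]{16}-0[01]", traceparent)
    return {**os.environ, "TRACEPARENT": traceparent}

env = child_env("4bf92f3577b34da6a3ce929d0e0e4736", "00f067aa0ba902b7")
print(env["TRACEPARENT"])
# subprocess.Popen(["codex", "exec", "--json", prompt], env=env, ...)
```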

Collector forwarding

Use this configuration when the Codex CLI emits OTLP to a local collector, which then forwards traces downstream.

receivers:
  otlp:
    protocols:
      http:
      grpc:

exporters:
  otlp/downstream:
    endpoint: downstream-collector:4317
    tls:
      insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlp/downstream]

Limits

  • The SDK fallback can only describe opaque public method calls; it cannot reconstruct tool-level work.
  • CodexCliInstrumentor only observes process boundaries. Use CodexCliJsonlAdapter with codex exec --json for comprehensive traces.
  • LLM spans are intentionally not fabricated unless a source surface exposes real model-call boundaries.
  • Current public Codex events do not expose first-class skill/plugin invocation items. They can only appear indirectly through visible dynamic/tool events today.
  • This library does not configure global exporters/providers for you.

Dev

uv run ruff check .
uv run pytest
docker compose -f docker-compose.phoenix.yml up --build --abort-on-container-exit --exit-code-from codex-live-smoke

Releases

This repo uses release-please with the simple strategy: version.txt is the release source of truth, pyproject.toml and _version.py are synced as release extra files, and the initial release is pinned to 0.1.0.

Pull request titles must follow Conventional Commits, and PRs should be squash-merged so that the PR title becomes the release commit consumed by release-please. While the project remains pre-1.0.0, breaking changes are configured to produce minor releases rather than majors, so releases stay on the minor/patch track until that policy changes.
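The policy above could be expressed as a release-please config fragment like the following. This is illustrative only: the key names match release-please's documented options, but the file paths and layout are assumptions about this repo, not its actual config:

```json
{
  "release-type": "simple",
  "bump-minor-pre-major": true,
  "packages": {
    ".": {
      "extra-files": ["pyproject.toml", "_version.py"]
    }
  }
}
```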

Merging the generated Release PR creates the GitHub tag and release. The workflow falls back to the built-in GITHUB_TOKEN when RELEASE_PLEASE_TOKEN is not configured, but set RELEASE_PLEASE_TOKEN to a PAT or GitHub App token so Release Please-created tags can trigger follow-on workflows. That secret is required for automatic package publication after the Release PR is merged. If you rely on the fallback token, repository settings must allow GitHub Actions to create pull requests. On v* tag pushes, publish-package.yaml builds the package and publishes it through PyPI Trusted Publishing from the pypi environment.

Project details


Download files


Source Distribution

openinference_instrumentation_codex-0.1.0.tar.gz (58.4 kB)


Built Distribution

openinference_instrumentation_codex-0.1.0-py3-none-any.whl

File details

Details for the file openinference_instrumentation_codex-0.1.0.tar.gz.


File hashes

Hashes for openinference_instrumentation_codex-0.1.0.tar.gz

Algorithm | Hash digest
SHA256 | 041d55f82a52497c8eb004fcbd90a6073665f4fbddb4c8c55ba9e08ce051df36
MD5 | d272ace8833cf576772c293aacf4d9cd
BLAKE2b-256 | db37800748f51fc86d1fe8d059138345b53a8189f1f5669982154ab5f8ca12c9


Provenance

The following attestation bundles were made for openinference_instrumentation_codex-0.1.0.tar.gz:

Publisher: publish-package.yaml on justinthelaw-oai/openinference-instrumentation-codex


File details

Details for the file openinference_instrumentation_codex-0.1.0-py3-none-any.whl.


File hashes

Hashes for openinference_instrumentation_codex-0.1.0-py3-none-any.whl

Algorithm | Hash digest
SHA256 | 6065df1576c5a354c7bb417fcdd968a1a331ab307526a6b9e81b4404106d25fb
MD5 | c6fade885b58c60f4c21252e1442e62a
BLAKE2b-256 | 1180db87c809b5f8d8fe79ce70f6b522aa5d83db0ec242fb15b6dd2607d8b07c


Provenance

The following attestation bundles were made for openinference_instrumentation_codex-0.1.0-py3-none-any.whl:

Publisher: publish-package.yaml on justinthelaw-oai/openinference-instrumentation-codex

