# openinference-instrumentation-codex

OpenInference instrumentation for Codex CLI/SDK telemetry, centered on Codex event streams rather than shallow method wrappers.
## Table of contents
- What it does
- Install
- Quickstart (CLI JSONL)
- Quickstart (app-server)
- Quickstart (SDK fallback)
- Span model
- Config
- Phoenix flow
- Collector forwarding
- Limits
- Dev
## What it does
The primary path is event-native:
- `CodexTraceBuilder` creates one logical `codex.session` root span, turn spans, and item spans.
- `CodexCliJsonlAdapter` normalizes `codex exec --json` JSONL events.
- `CodexAppServerEventAdapter` normalizes richer app-server notifications.
Fallback-only paths remain available when richer events are not exposed:
- `CodexInstrumentor` wraps opaque SDK calls such as `codex.Client.run`, `create_task`, and instance-bound `responses.create`.
- `CodexCliInstrumentor` traces outer `subprocess.run(...)` calls for Codex commands.
Fallback spans do not fabricate a session tree or fake LLM/tool boundaries.
## Install

```shell
pip install openinference-instrumentation-codex
```

For the public CLI runner plus OTLP export support:

```shell
pip install "openinference-instrumentation-codex[otlp]"
openinference-codex exec -- "Summarize this repository"
```

For development/testing:

```shell
uv sync --extra test
```
## Quickstart (CLI JSONL)

For process-based integrations, prefer the public runner:

```shell
openinference-codex exec -- "Summarize this repository"
```
It launches `codex exec --json`, builds the event-native session tree, honors standard OTel trace exporter/resource environment variables, and propagates `TRACEPARENT` / `TRACESTATE` when present.
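Because the runner reads standard OTel environment variables, exporter and resource configuration can live entirely in the shell; the endpoint and project name below are placeholders for your setup:

```shell
# Standard OTel settings the runner honors (values are examples).
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:6006/v1/traces
export OTEL_RESOURCE_ATTRIBUTES=openinference.project.name=codex-traces
# Then launch: openinference-codex exec -- "Summarize this repository"
```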
```python
import subprocess

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

from openinference.instrumentation.codex import CodexCliJsonlAdapter, CodexTraceBuilder

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)
trace.set_tracer_provider(provider)

builder = CodexTraceBuilder(tracer_provider=provider)
adapter = CodexCliJsonlAdapter(builder)

proc = subprocess.Popen(
    ["codex", "exec", "--json", "Summarize this repository"],
    stdout=subprocess.PIPE,
    text=True,
)
assert proc.stdout is not None
for line in proc.stdout:
    adapter.observe_line(line)
proc.wait()
builder.finish()
```
## Quickstart (app-server)

```python
from openinference.instrumentation.codex import CodexAppServerEventAdapter, CodexTraceBuilder

builder = CodexTraceBuilder()
adapter = CodexAppServerEventAdapter(builder)

# Feed JSON-RPC notifications from Codex app-server as they arrive,
# where `notification` is each parsed notification object.
adapter.observe_notification(notification)
builder.finish()
```
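Notifications can be picked out of a JSON-RPC stream before handing them to the adapter. The helper below is an illustrative sketch, not part of this library's API, and the method name on the sample line is hypothetical:

```python
import json

def is_notification(message: dict) -> bool:
    # Per JSON-RPC 2.0, a notification carries "method" but no "id";
    # requests (which expect responses) carry both.
    return "method" in message and "id" not in message

# Hypothetical wire line; real Codex app-server method names may differ.
line = '{"jsonrpc": "2.0", "method": "codex/event", "params": {"type": "turn.started"}}'
message = json.loads(line)
print(is_notification(message))  # True
```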
## Quickstart (SDK fallback)

```python
from openinference.instrumentation.codex import CodexInstrumentor

# Use only when native SDK events/tracing are not available.
CodexInstrumentor().instrument()
```
## Span model

| Surface | Root span | Child spans |
|---|---|---|
| Event-native CLI/app-server | `codex.session` (AGENT) | `codex.turn` (CHAIN), tool/item spans |
| SDK fallback | none synthesized | opaque `codex.Client.*` fallback spans |
| CLI fallback | none synthesized | outer `codex.cli.run` process span |
Current item mappings include:

- `command_execution` / `commandExecution` -> `codex.tool.command_execution`
- `file_change` / `fileChange` -> `codex.file_change`
- `mcp_tool_call` / `mcpToolCall` -> `codex.tool.mcp`
- `dynamicToolCall` -> `codex.tool.dynamic`
- `collabToolCall` -> `codex.agent.subagent`
- `web_search` / `webSearch` -> `codex.tool.web_search`
- `imageView` -> `codex.tool.image_view`
- `contextCompaction` -> `codex.context_compaction`
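Applied as a lookup, the mapping behaves like this sketch (the dict is transcribed from this README for illustration; it is not the library's internal table):

```python
# Item type -> span name, as documented in the mapping list above.
ITEM_SPAN_NAMES = {
    "command_execution": "codex.tool.command_execution",
    "commandExecution": "codex.tool.command_execution",
    "file_change": "codex.file_change",
    "fileChange": "codex.file_change",
    "mcp_tool_call": "codex.tool.mcp",
    "mcpToolCall": "codex.tool.mcp",
    "dynamicToolCall": "codex.tool.dynamic",
    "collabToolCall": "codex.agent.subagent",
    "web_search": "codex.tool.web_search",
    "webSearch": "codex.tool.web_search",
    "imageView": "codex.tool.image_view",
    "contextCompaction": "codex.context_compaction",
}

print(ITEM_SPAN_NAMES["mcpToolCall"])  # codex.tool.mcp
```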
Mapped attributes use OpenInference/OpenTelemetry conventions where available, including:

- `openinference.span.kind`
- `session.id`
- `graph.node.id`, `graph.node.parent_id`
- `llm.model_name`, `llm.provider`
- `llm.token_count.prompt`, `llm.token_count.completion`, `llm.token_count.total`
- `tool.id`, `tool.name`, `tool.parameters`
Optional content fields:

- `input.value` (when `capture_inputs=True`)
- `output.value` (when `capture_outputs=True`)
## Config
Safe defaults are privacy-first.
| Env var | Default | Meaning |
|---|---|---|
| `OPENINFERENCE_CODEX_ENABLED` | `true` | Enable/disable instrumentation |
| `OPENINFERENCE_CODEX_CAPTURE_INPUTS` | `false` | Capture input content |
| `OPENINFERENCE_CODEX_CAPTURE_OUTPUTS` | `false` | Capture output content |
| `OPENINFERENCE_CODEX_CAPTURE_TOOL_OUTPUTS` | `false` | Capture tool result content |
| `OPENINFERENCE_CODEX_REDACT_INPUTS` | `true` | Redact sensitive patterns |
| `OPENINFERENCE_CODEX_MAX_ATTRIBUTE_LENGTH` | `4096` | Truncate serialized attributes |
| `OPENINFERENCE_CODEX_PRESERVE_ORIGINAL_ATTRIBUTES` | `true` | Preserve source attrs as `codex.original.*` |
> [!NOTE]
> When capture flags are off, the corresponding `codex.original.input` / `codex.original.output` / `codex.original.invocation_parameters` / tool-output originals are also suppressed.
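As an illustration of how a length limit like `OPENINFERENCE_CODEX_MAX_ATTRIBUTE_LENGTH` typically applies (a sketch of the general technique, not this library's actual implementation):

```python
import json
import os

def truncate_attribute(value, limit=None):
    # Serialize non-string values, then clamp to the configured limit.
    if limit is None:
        limit = int(os.environ.get("OPENINFERENCE_CODEX_MAX_ATTRIBUTE_LENGTH", "4096"))
    text = value if isinstance(value, str) else json.dumps(value)
    return text[:limit]

os.environ["OPENINFERENCE_CODEX_MAX_ATTRIBUTE_LENGTH"] = "32"
print(truncate_attribute({"prompt": "a long serialized payload that will be cut off"}))
```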
## Phoenix flow

```mermaid
flowchart LR
    A[Codex CLI/app-server events] --> B[openinference-instrumentation-codex]
    B --> C[OTel SDK + OTLP exporter]
    C --> D[Phoenix /v1/traces]
```
For Phoenix project routing, set `openinference.project.name` as a resource attribute on the tracer provider. See `examples/phoenix_example.py`.
When the event-native adapter is the authoritative trace source for a Codex run, avoid also exporting a separate native Codex root trace unless you have verified parent-context behavior. codexops disables the nested Codex trace exporter and propagates `TRACEPARENT` into child Codex processes so the exported tree stays singular and inspectable.
## Collector forwarding

Use this when Codex CLI emits OTLP to a local collector that then forwards downstream:

```yaml
receivers:
  otlp:
    protocols:
      http:
      grpc:
exporters:
  otlp/downstream:
    endpoint: downstream-collector:4317
    tls:
      insecure: true
service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlp/downstream]
```
## Limits
- The SDK fallback can only describe opaque public method calls; it cannot reconstruct tool-level work.
- `CodexCliInstrumentor` only observes process boundaries. Use `CodexCliJsonlAdapter` with `codex exec --json` for comprehensive traces.
- LLM spans are intentionally not fabricated unless a source surface exposes real model-call boundaries.
- Current public Codex events do not expose first-class skill/plugin invocation items. They can only appear indirectly through visible dynamic/tool events today.
- This library does not configure global exporters/providers for you.
## Dev

```shell
uv run ruff check .
uv run pytest
docker compose -f docker-compose.phoenix.yml up --build --abort-on-container-exit --exit-code-from codex-live-smoke
```
## Releases

This repo uses release-please with the simple strategy. `version.txt` is the release source of truth, `pyproject.toml` and `_version.py` are synced as release extra files, and the initial release is pinned to 0.1.0.

Pull request titles must follow Conventional Commits, and PRs should be squash-merged so the title becomes the release commit consumed by release-please. While the project remains pre-1.0.0, breaking changes are configured to produce minor releases rather than majors, so releases stay on the minor/patch track until that policy changes.

Merging the generated Release PR creates the GitHub tag and release. The workflow falls back to the built-in `GITHUB_TOKEN` when `RELEASE_PLEASE_TOKEN` is not configured, but set `RELEASE_PLEASE_TOKEN` to a PAT or GitHub App token so tags created by Release Please can trigger follow-on workflows. That secret is required for automatic package publication after the Release PR is merged. If you rely on the fallback token, repository settings must allow GitHub Actions to create pull requests. On `v*` tag pushes, `publish-package.yaml` builds the package and publishes it through PyPI Trusted Publishing from the `pypi` environment.
## File details

Details for the file `openinference_instrumentation_codex-0.1.0.tar.gz`.

### File metadata

- Download URL: openinference_instrumentation_codex-0.1.0.tar.gz
- Upload date:
- Size: 58.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.13
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `041d55f82a52497c8eb004fcbd90a6073665f4fbddb4c8c55ba9e08ce051df36` |
| MD5 | `d272ace8833cf576772c293aacf4d9cd` |
| BLAKE2b-256 | `db37800748f51fc86d1fe8d059138345b53a8189f1f5669982154ab5f8ca12c9` |
### Provenance

The following attestation bundle was made for `openinference_instrumentation_codex-0.1.0.tar.gz`:

Publisher: `publish-package.yaml` on justinthelaw-oai/openinference-instrumentation-codex

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: openinference_instrumentation_codex-0.1.0.tar.gz
- Subject digest: 041d55f82a52497c8eb004fcbd90a6073665f4fbddb4c8c55ba9e08ce051df36
- Sigstore transparency entry: 1395062074
- Sigstore integration time:
- Permalink: justinthelaw-oai/openinference-instrumentation-codex@0581bcac221c75400fcbccef5a859e087f137c9f
- Branch / Tag: refs/heads/main
- Owner: https://github.com/justinthelaw-oai
- Access: private
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-package.yaml@0581bcac221c75400fcbccef5a859e087f137c9f
- Trigger Event: workflow_dispatch
## File details

Details for the file `openinference_instrumentation_codex-0.1.0-py3-none-any.whl`.

### File metadata

- Download URL: openinference_instrumentation_codex-0.1.0-py3-none-any.whl
- Upload date:
- Size: 26.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.13
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `6065df1576c5a354c7bb417fcdd968a1a331ab307526a6b9e81b4404106d25fb` |
| MD5 | `c6fade885b58c60f4c21252e1442e62a` |
| BLAKE2b-256 | `1180db87c809b5f8d8fe79ce70f6b522aa5d83db0ec242fb15b6dd2607d8b07c` |
### Provenance

The following attestation bundle was made for `openinference_instrumentation_codex-0.1.0-py3-none-any.whl`:

Publisher: `publish-package.yaml` on justinthelaw-oai/openinference-instrumentation-codex

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: openinference_instrumentation_codex-0.1.0-py3-none-any.whl
- Subject digest: 6065df1576c5a354c7bb417fcdd968a1a331ab307526a6b9e81b4404106d25fb
- Sigstore transparency entry: 1395062087
- Sigstore integration time:
- Permalink: justinthelaw-oai/openinference-instrumentation-codex@0581bcac221c75400fcbccef5a859e087f137c9f
- Branch / Tag: refs/heads/main
- Owner: https://github.com/justinthelaw-oai
- Access: private
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-package.yaml@0581bcac221c75400fcbccef5a859e087f137c9f
- Trigger Event: workflow_dispatch