Codex auth helpers for pydantic-ai OpenAI Responses models.

codex-auth-helper

codex-auth-helper turns an existing local Codex auth session into either:

  • a pydantic-ai Responses model
  • a LangChain ChatOpenAI model pinned to the OpenAI Responses API

It reads ~/.codex/auth.json, refreshes access tokens when needed, builds Codex-specific OpenAI clients for the Responses endpoint, and returns either a ready-to-use CodexResponsesModel or a LangChain chat model.

What It Does

  • Reads tokens from ~/.codex/auth.json
  • Derives ChatGPT-Account-Id from the auth file or token claims
  • Refreshes expired access tokens with https://auth.openai.com/oauth/token
  • Writes refreshed tokens back to the auth file
  • Builds an OpenAI-compatible client pointed at https://chatgpt.com/backend-api/codex
  • Returns a pydantic-ai responses model that already applies the Codex backend requirements
  • Returns a LangChain ChatOpenAI model configured for the Responses API

The helper enforces two backend-specific behaviors for you:

  • openai_store=False
  • streamed responses even when pydantic-ai calls the non-streamed request() path
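Conceptually, the enforcement amounts to merging caller-supplied settings and then pinning the two backend requirements on top. This is an illustrative sketch, not the library's actual internals:

```python
def apply_codex_requirements(request_kwargs: dict) -> dict:
    """Illustrative sketch: copy caller kwargs, then pin the two
    settings the Codex backend rejects requests without."""
    pinned = dict(request_kwargs)
    pinned["store"] = False   # backend requires store=false
    pinned["stream"] = True   # backend requires streamed responses
    return pinned


print(apply_codex_requirements({"reasoning": {"summary": "concise"}}))
# {'reasoning': {'summary': 'concise'}, 'store': False, 'stream': True}
```

Caller-supplied values for store or stream are overwritten, which is why overriding them only works by modifying the model after construction.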

What It Does Not Do

  • It does not log you into Codex
  • It does not create ~/.codex/auth.json
  • It does not provide generic Chat Completions wiring
  • It does not replace pydantic-ai; it only provides a model/client factory

Install

uv add codex-auth-helper
pip install codex-auth-helper

For LangChain usage:

uv add "codex-auth-helper[langchain]"
pip install "codex-auth-helper[langchain]"

You also need an existing Codex auth session on the same machine:

~/.codex/auth.json

If you have not logged in yet:

codex login

Quick Start

from codex_auth_helper import create_codex_responses_model
from pydantic_ai import Agent

model = create_codex_responses_model(
    "gpt-5.4",
    instructions="You are a helpful coding assistant.",
)
agent = Agent(model)

result = agent.run_sync("What's up?")
print(result.output)

LangChain Quick Start

from codex_auth_helper import create_codex_chat_openai
from langchain.agents import create_agent

graph = create_agent(
    model=create_codex_chat_openai(
        "gpt-5.4",
        instructions="You are a helpful coding assistant.",
    ),
    tools=[],
    name="codex-graph",
)

The LangChain helper returns langchain_openai.ChatOpenAI configured to:

  • use the Codex Responses endpoint
  • reuse local Codex auth state
  • keep use_responses_api=True
  • default to output_version="responses/v1"
  • require instructions= and pass it through to the Responses request

instructions is mandatory for create_codex_chat_openai(...). The helper does not provide an implicit system prompt for the LangChain path; callers must pass the behavior they want explicitly.

The same rule applies to create_codex_responses_model(...) on the Pydantic path. Pass the Codex system behavior to the helper directly instead of relying on a separate agent-level instruction just to seed the model.

Custom Auth Path

If you want to read a different auth file, pass a custom config:

from pathlib import Path

from codex_auth_helper import CodexAuthConfig, create_codex_responses_model

config = CodexAuthConfig(auth_path=Path("/tmp/codex-auth.json"))
model = create_codex_responses_model(
    "gpt-5.4",
    config=config,
    instructions="You are a helpful coding assistant.",
)

Passing Extra OpenAI Responses Settings

Additional OpenAIResponsesModelSettings can still be passed through. The helper keeps openai_store=False unless you explicitly override the model after construction.

from codex_auth_helper import create_codex_responses_model

model = create_codex_responses_model(
    "gpt-5.4",
    instructions="You are a helpful coding assistant.",
    settings={
        "openai_reasoning_summary": "concise",
    },
)

Lower-Level Client Factory

If you only want the authenticated OpenAI client, use create_codex_async_openai(...):

from codex_auth_helper import create_codex_async_openai

client = create_codex_async_openai()

This returns CodexAsyncOpenAI, a subclass of openai.AsyncOpenAI.

If you need the sync OpenAI client, use create_codex_openai(...).
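With the raw async client you can drive a streamed Responses call by hand. This is a sketch: the event type string follows the OpenAI SDK's Responses streaming events, the model name mirrors the examples above, and store=False / stream=True are the backend requirements the higher-level helpers enforce for you:

```python
import asyncio


async def stream_codex_response(client, prompt: str) -> str:
    """Stream a Responses call through a Codex-authenticated client,
    e.g. one returned by create_codex_async_openai()."""
    stream = await client.responses.create(
        model="gpt-5.4",
        instructions="You are a helpful coding assistant.",
        input=prompt,
        store=False,   # required by the Codex backend
        stream=True,   # required by the Codex backend
    )
    chunks = []
    async for event in stream:
        # Collect incremental text deltas from the event stream.
        if getattr(event, "type", "") == "response.output_text.delta":
            chunks.append(event.delta)
    return "".join(chunks)
```

Run it with something like asyncio.run(stream_codex_response(create_codex_async_openai(), "hi")).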

Public API

from codex_auth_helper import (
    CodexAsyncOpenAI,
    CodexAuthConfig,
    CodexAuthState,
    CodexOpenAI,
    CodexAuthStore,
    CodexResponsesModel,
    CodexTokenManager,
    create_codex_async_openai,
    create_codex_chat_openai,
    create_codex_openai,
    create_codex_responses_model,
)

Errors

Typical failure modes:

  • "Codex auth file was not found ...": the machine is not logged into Codex yet.
  • "Codex auth file ... does not contain valid JSON": the auth file is corrupt or partially written.
  • ModelHTTPError mentioning "Store must be set to false": you are not using the helper-backed model instance.
  • ModelHTTPError mentioning "Stream must be set to true": you are not using CodexResponsesModel.

Package Notes

This package is intentionally small and focused:

  • auth file parsing
  • token refresh
  • Codex-specific OpenAI client wiring
  • pydantic-ai responses model factory
  • LangChain Responses-model factory
