
Codex auth helpers for pydantic-ai OpenAI Responses models.


codex-auth-helper

codex-auth-helper turns an existing local Codex auth session into either:

  • a pydantic-ai Responses model
  • a LangChain ChatOpenAI model pinned to the OpenAI Responses API

It reads ~/.codex/auth.json, refreshes access tokens when needed, builds Codex-specific OpenAI clients for the Responses endpoint, and returns either a ready-to-use CodexResponsesModel or a LangChain chat model.

What It Does

  • Reads tokens from ~/.codex/auth.json
  • Derives ChatGPT-Account-Id from the auth file or token claims
  • Refreshes expired access tokens with https://auth.openai.com/oauth/token
  • Writes refreshed tokens back to the auth file
  • Builds an OpenAI-compatible client pointed at https://chatgpt.com/backend-api/codex
  • Returns a pydantic-ai responses model that already applies the Codex backend requirements
  • Returns a LangChain ChatOpenAI model configured for the Responses API
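
The account-id derivation and the refresh decision can be pictured with a short sketch. This is illustrative only: the claim name chatgpt_account_id and the token layout below are assumptions, not the helper's actual internals, and the token is a locally built unsigned JWT just to exercise the decoding path.

```python
import base64
import json
import time

def _b64url_decode(segment: str) -> bytes:
    # JWT segments are base64url without padding; restore it before decoding.
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def read_claims(access_token: str) -> dict:
    # A JWT is header.payload.signature; the claims live in the payload.
    payload = access_token.split(".")[1]
    return json.loads(_b64url_decode(payload))

def needs_refresh(claims: dict, skew: int = 60) -> bool:
    # Refresh slightly before the `exp` claim to absorb clock skew.
    return claims.get("exp", 0) <= time.time() + skew

# Build a throwaway unsigned token (hypothetical claim name) to demo decoding.
claims_in = {"chatgpt_account_id": "acct_123", "exp": int(time.time()) + 3600}
fake_jwt = ".".join(
    base64.urlsafe_b64encode(json.dumps(part).encode()).rstrip(b"=").decode()
    for part in ({"alg": "none"}, claims_in)
) + "."

claims = read_claims(fake_jwt)
print(claims["chatgpt_account_id"])  # acct_123
print(needs_refresh(claims))         # False
```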

The helper enforces two backend-specific behaviors for you:

  • openai_store=False
  • streamed responses even when pydantic-ai calls the non-streamed request() path

What It Does Not Do

  • It does not log you into Codex
  • It does not create ~/.codex/auth.json
  • It does not provide generic Chat Completions wiring
  • It does not replace pydantic-ai; it only provides a model/client factory

Install

uv add codex-auth-helper
pip install codex-auth-helper

For LangChain usage:

uv add "codex-auth-helper[langchain]"
pip install "codex-auth-helper[langchain]"

You also need an existing Codex auth session on the same machine:

~/.codex/auth.json

If you have not logged in yet:

codex login

Quick Start

from codex_auth_helper import create_codex_responses_model
from pydantic_ai import Agent

model = create_codex_responses_model(
    "gpt-5.4",
    instructions="You are a helpful coding assistant.",
)
agent = Agent(model)

result = agent.run_sync("What's up?")
print(result.output)

LangChain Quick Start

from codex_auth_helper import create_codex_chat_openai
from langchain.agents import create_agent

graph = create_agent(
    model=create_codex_chat_openai(
        "gpt-5.4",
        instructions="You are a helpful coding assistant.",
    ),
    tools=[],
    name="codex-graph",
)

The LangChain helper returns langchain_openai.ChatOpenAI configured to:

  • use the Codex Responses endpoint
  • reuse local Codex auth state
  • keep use_responses_api=True
  • default to output_version="responses/v1"
  • require instructions= and pass it through to the Responses request

instructions is mandatory for create_codex_chat_openai(...). The helper does not provide an implicit system prompt for the LangChain path; callers must pass the behavior they want explicitly.

The same rule applies to create_codex_responses_model(...) on the Pydantic path. Pass the Codex system behavior to the helper directly instead of relying on a separate agent-level instruction just to seed the model.

Custom Auth Path

If you want to read a different auth file, pass a custom config:

from pathlib import Path

from codex_auth_helper import CodexAuthConfig, create_codex_responses_model

config = CodexAuthConfig(auth_path=Path("/tmp/codex-auth.json"))
model = create_codex_responses_model(
    "gpt-5.4",
    config=config,
    instructions="You are a helpful coding assistant.",
)

Passing Extra OpenAI Responses Settings

Additional OpenAIResponsesModelSettings can still be passed through. The helper keeps openai_store=False unless you explicitly override the model after construction.

from codex_auth_helper import create_codex_responses_model

model = create_codex_responses_model(
    "gpt-5.4",
    instructions="You are a helpful coding assistant.",
    settings={
        "openai_reasoning_summary": "concise",
    },
)

Lower-Level Client Factory

If you only want the authenticated OpenAI client, use create_codex_async_openai(...):

from codex_auth_helper import create_codex_async_openai

client = create_codex_async_openai()

This returns CodexAsyncOpenAI, a subclass of openai.AsyncOpenAI.

If you need the sync OpenAI client, use create_codex_openai(...).
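
Conceptually, such a client is an OpenAI client pointed at the Codex base URL with the account header attached. The sketch below shows that wiring as plain kwargs; the header name follows the ChatGPT-Account-Id value mentioned above, but the exact shape is an assumption, not the package's internals:

```python
def codex_client_kwargs(access_token: str, account_id: str) -> dict:
    # The pieces an OpenAI-compatible client needs to reach the Codex backend:
    # the Codex Responses base URL, the bearer token, and the account header.
    return {
        "base_url": "https://chatgpt.com/backend-api/codex",
        "api_key": access_token,
        "default_headers": {"ChatGPT-Account-Id": account_id},
    }

kwargs = codex_client_kwargs("tok_abc", "acct_123")
# e.g. client = openai.AsyncOpenAI(**kwargs)
print(kwargs["base_url"])
```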

Public API

from codex_auth_helper import (
    CodexAsyncOpenAI,
    CodexAuthConfig,
    CodexAuthState,
    CodexOpenAI,
    CodexAuthStore,
    CodexResponsesModel,
    CodexTokenManager,
    create_codex_async_openai,
    create_codex_chat_openai,
    create_codex_openai,
    create_codex_responses_model,
)

Errors

Typical failure modes:

  • Codex auth file was not found ...: the machine is not logged into Codex yet.
  • Codex auth file ... does not contain valid JSON: the auth file is corrupt or partially written.
  • ModelHTTPError ... Store must be set to false: you are not using the helper-backed model instance.
  • ModelHTTPError ... Stream must be set to true: you are not using CodexResponsesModel.

Package Notes

This package is intentionally small and focused:

  • auth file parsing
  • token refresh
  • Codex-specific OpenAI client wiring
  • pydantic-ai responses model factory
  • LangChain Responses-model factory



