

codex-open-client


Python client for OpenAI Codex — use your ChatGPT Plus/Pro subscription for API access.

Documentation

Installation

pip install codex-open-client

Quick Start

import codex_open_client

client = codex_open_client.CodexClient()

response = client.responses.create(
    model="gpt-5.1-codex-mini",
    instructions="Be brief.",
    input="What is 2 + 2?",
)
print(response.output_text)

On first run, your browser opens for OAuth login. Tokens are cached at ~/.codex/auth.json (shared with the official Codex CLI) and refreshed automatically after that.

Authentication

Multiple ways to authenticate, depending on your environment:

# Default — opens browser, local server catches the callback
client = codex_open_client.CodexClient()

# Headless — prints URL, you paste the redirect URL back (servers, Docker, CI)
client = codex_open_client.CodexClient(headless=True)

# Custom handler — full control over the auth UX (GUI apps, bots, web apps)
def my_handler(url: str) -> str:
    send_url_to_user(url)                # placeholder: deliver the login URL to your user
    return get_callback_url_from_user()  # placeholder: collect the redirect URL they land on

client = codex_open_client.CodexClient(login_handler=my_handler)
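A handler only needs to deliver the URL and return the pasted redirect. One way to build handlers from any send/receive pair, with a basic sanity check on the pasted value (the check itself is an illustration; the client does the actual OAuth parsing):

```python
from urllib.parse import urlparse, parse_qs

def make_handler(send, receive):
    """Build a login_handler from two callables:
    send(url) delivers the login URL to the user,
    receive() returns the callback URL they paste back."""
    def handler(url: str) -> str:
        send(url)
        callback = receive().strip()
        # Sanity-check that the pasted value is a URL with a query string;
        # the client itself extracts the OAuth parameters from it.
        parsed = urlparse(callback)
        if not parsed.scheme or not parse_qs(parsed.query):
            raise ValueError("expected the full redirect URL, query string included")
        return callback
    return handler
```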

For async or multi-step flows, use the two-step API:

auth = codex_open_client.start_login()
# present auth.url to the user, collect callback URL however you want
tokens = codex_open_client.finish_login(auth, callback_url="http://localhost:1455/...")

Streaming

with client.responses.create(
    model="gpt-5.1-codex-mini",
    instructions="Be helpful.",
    input="Write a haiku about Python.",
    stream=True,
) as stream:
    for event in stream:
        if isinstance(event, codex_open_client.ResponseOutputTextDeltaEvent):
            print(event.delta, end="", flush=True)
    print()
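The deltas can be accumulated instead of printed. A stdlib-only sketch of that pattern, using stub event classes in place of the library's (`ResponseOutputTextDeltaEvent` is the real name; the stubs and the fake stream below are purely illustrative):

```python
from dataclasses import dataclass

@dataclass
class TextDelta:   # stand-in for ResponseOutputTextDeltaEvent
    delta: str

@dataclass
class Done:        # stand-in for any non-text event
    pass

def collect_text(stream) -> str:
    """Concatenate text deltas, ignoring other event types."""
    parts = []
    for event in stream:
        if isinstance(event, TextDelta):
            parts.append(event.delta)
    return "".join(parts)

fake_stream = [TextDelta("Hello"), TextDelta(", "), Done(), TextDelta("world")]
print(collect_text(fake_stream))  # Hello, world
```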

Tool Calls

import json

tool = codex_open_client.FunctionTool(
    name="get_weather",
    description="Get weather for a city.",
    parameters={
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
        "additionalProperties": False,
    },
)

response = client.responses.create(
    model="gpt-5.1-codex-mini",
    instructions="Use tools when helpful.",
    input="What's the weather in Tokyo?",
    tools=[tool],
)

for call in response.tool_calls:
    print(f"{call.name}({call.arguments})")
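Printing the calls is only half the roundtrip; typically each call is dispatched to a local function and the result sent back to the model. A minimal dispatcher sketch with a stub call object (the `name`/`arguments` attributes match the loop above; `arguments` arriving as a JSON-encoded string is an assumption here):

```python
import json
from dataclasses import dataclass

@dataclass
class Call:          # stand-in for a response tool call
    name: str
    arguments: str   # JSON-encoded arguments

def get_weather(city: str) -> str:
    return f"Sunny in {city}"   # stub implementation

HANDLERS = {"get_weather": get_weather}

def dispatch(call: Call) -> str:
    """Look up the handler by name and apply the decoded arguments."""
    fn = HANDLERS[call.name]
    return fn(**json.loads(call.arguments))

print(dispatch(Call("get_weather", '{"city": "Tokyo"}')))  # Sunny in Tokyo
```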

Structured Output

Get typed responses using Pydantic models:

pip install "codex-open-client[pydantic]"

from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int
    city: str

parsed = client.responses.parse(
    model="gpt-5.1-codex-mini",
    instructions="Extract the person info.",
    input="John Smith is 30 years old and lives in New York.",
    text_format=Person,
)

print(parsed.output_parsed.name)  # "John Smith"
print(parsed.output_parsed.age)   # 30

Also works with manual JSON schema via TextConfig and ResponseFormatJsonSchema — see the docs.
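Conceptually, structured output asks the model for JSON matching the schema and validates the reply into your class. A stdlib-only illustration of that last step (a dataclass stands in for the Pydantic model so the sketch runs without extras; the string below plays the role of the model's JSON reply):

```python
import json
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int
    city: str

raw = '{"name": "John Smith", "age": 30, "city": "New York"}'  # simulated model output
person = Person(**json.loads(raw))
print(person.name, person.age)  # John Smith 30
```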

Features

  • Automatic auth — OAuth PKCE with token caching and refresh
  • Typed API — dataclass-based types for all objects, full mypy strict support
  • Structured output — parse() with Pydantic models or manual JSON schemas
  • Streaming — iterate SSE events with context manager support
  • Tool calls — function calling with roundtrip helpers
  • Retries — built-in exponential backoff for 429/5xx
  • Models — list available models with cached metadata
  • Headless mode — works on remote servers, Docker, CI
  • Custom login — bring your own auth UX with login_handler
  • CLI interop — shares token storage with the official Codex CLI
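The retry behavior above follows the usual exponential-backoff-with-jitter shape; an illustrative schedule (the base delay, cap, and jitter strategy here are examples, not the library's actual values):

```python
import random

def backoff_delays(attempts: int, base: float = 0.5, cap: float = 8.0) -> list[float]:
    """Delay before each retry: base * 2**n, capped, with full jitter."""
    return [random.uniform(0, min(cap, base * 2 ** n)) for n in range(attempts)]

for n, delay in enumerate(backoff_delays(5), start=1):
    print(f"retry {n}: sleep {delay:.2f}s")
```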

Requirements

  • Python 3.10+
  • ChatGPT Plus or Pro subscription

License

MIT


