DSPy integration for ChatGPT Codex subscription-backed language models

dspy-codex-auth

DSPy integration for using ChatGPT/Codex subscription credentials as a DSPy language model.

This package is intentionally narrow:

  • It includes ChatGPT/Codex OAuth login, token refresh, and Pi-compatible credential storage.
  • It installs a DSPy LM wrapper for codex/... model strings.
  • It repairs Codex Responses streaming payloads from the current Codex backend that DSPy 3.2 cannot parse.

Install

uv add dspy-codex-auth

Login

If you already have Codex credentials in ~/.pi/agent/auth.json, no extra login is needed. The package reads and refreshes that Pi-compatible credential file directly.
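The check above can be done by hand before deciding whether to log in. A minimal sketch, assuming only that the Pi-compatible credential file lives at the `~/.pi/agent/auth.json` path mentioned above (the helper name is hypothetical, not part of the package API):

```python
from pathlib import Path

# Path of the Pi-compatible credential file described above.
PI_AUTH_PATH = Path.home() / ".pi" / "agent" / "auth.json"


def has_pi_credentials() -> bool:
    """Return True if a Pi-compatible credential file already exists."""
    return PI_AUTH_PATH.is_file()


if not has_pi_credentials():
    print("No Codex credentials found; run dspy_codex_auth.login() first.")
```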

Otherwise:

uv run python -c "import dspy_codex_auth; dspy_codex_auth.login()"

Basic Usage

import dspy
import dspy_codex_auth

dspy_codex_auth.install()

lm = dspy.LM("codex/gpt-5.5", cache=False)
dspy.configure(lm=lm, adapter=dspy.JSONAdapter())

Use codex/<model> for the ChatGPT Codex subscription route. This is not the OpenAI API key route and does not require OPENAI_API_KEY.

cache=False is recommended for Codex while iterating because stale DSPy cache entries can preserve old empty-output responses across package upgrades.

Swapping Models

Call dspy_codex_auth.install() once near process startup. Codex model strings use subscription auth; non-Codex model strings continue through DSPy's normal LM behavior.

import dspy
import dspy_codex_auth

dspy_codex_auth.install()


def configure_model(model: str, **kwargs):
    lm = dspy.LM(model, **kwargs)
    dspy.configure(lm=lm, adapter=dspy.JSONAdapter())
    return lm


codex_lm = configure_model("codex/gpt-5.5", cache=False)
api_lm = configure_model("openai/gpt-5.5", api_key="...", cache=False)

Reasoning Summary

Pass reasoning_effort as usual. This package also supports reasoning_summary, which maps to the Responses API reasoning.summary field.

lm = dspy.LM(
    "codex/gpt-5.5",
    cache=False,
    reasoning_effort="medium",
    reasoning_summary="detailed",
)

DSPy predictions expose declared output fields. The lower-level LM history can also include a returned reasoning summary:

summary = lm.history[-1]["outputs"][0].get("reasoning_content")

OpenAI-Style Model String With Codex Auth

If you prefer to keep an openai/... model string and select Codex auth explicitly:

lm = dspy_codex_auth.LM(
    "openai/gpt-5.5",
    auth_provider="codex",
    cache=False,
    reasoning_effort="medium",
    reasoning_summary="detailed",
)

This is useful when the rest of your app treats model names as provider-neutral strings and you want auth selection to be a separate setting.
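One way to keep that separation is to derive the LM keyword arguments from a standalone auth setting. The sketch below is illustrative only: `lm_kwargs_for` is a hypothetical helper, and the `auth_provider` and `api_key` keys mirror the constructor calls shown earlier.

```python
# Hypothetical helper: model names stay provider-neutral strings, and the
# auth route is chosen from a separate setting, as described above.
def lm_kwargs_for(auth: str) -> dict:
    kwargs = {"cache": False}
    if auth == "codex":
        # ChatGPT Codex subscription route: no API key needed.
        kwargs["auth_provider"] = "codex"
    elif auth == "api_key":
        # Standard OpenAI API key route (placeholder key).
        kwargs["api_key"] = "sk-..."
    else:
        raise ValueError(f"unknown auth setting: {auth!r}")
    return kwargs


# The result would then be splatted into dspy_codex_auth.LM(model, **kwargs)
# for the Codex route, or dspy.LM(model, **kwargs) otherwise.
codex_kwargs = lm_kwargs_for("codex")
```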

What It Fixes

The ChatGPT Codex backend streams useful output events, but the completed LiteLLM Responses object can arrive with response.output == []. DSPy expects Responses output items to contain final message text, function calls, and reasoning summaries. This package reconstructs those output items from stream events before DSPy parses the response.

It currently handles:

  • DSPy few-shot and conversation-history assistant messages by encoding them as Responses output_text blocks, which supports optimizers such as LabeledFewShot.
  • GEPA reflection calls that invoke the LM with a plain prompt string.
  • response.output_item.done
  • response.output_text.done
  • response.output_text.delta
  • response.reasoning_summary_text.done
  • response.reasoning_summary_text.delta
  • streamed function-call output items

It also strips output-token cap fields that the Codex backend currently rejects:

  • max_tokens
  • max_output_tokens
  • max_completion_tokens
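The two fixes can be pictured with a simplified sketch. This is not the package's actual internals, only an illustration of the shape of the work: accumulate text and reasoning-summary deltas from the event types listed above into Responses-style output items, and drop the rejected output-token cap fields from request parameters.

```python
# Illustrative sketch of reconstructing Responses output items from
# streamed events when the completed object arrives with output == [].
def rebuild_output(events: list[dict]) -> list[dict]:
    items, text_parts, summary_parts = [], [], []
    for ev in events:
        kind = ev.get("type")
        if kind == "response.output_text.delta":
            text_parts.append(ev["delta"])
        elif kind == "response.reasoning_summary_text.delta":
            summary_parts.append(ev["delta"])
        elif kind == "response.output_item.done":
            item = ev["item"]
            if item.get("type") == "function_call":
                # Keep streamed function-call output items as-is.
                items.append(item)
    if summary_parts:
        items.append({
            "type": "reasoning",
            "summary": [{"type": "summary_text",
                         "text": "".join(summary_parts)}],
        })
    if text_parts:
        items.append({
            "type": "message", "role": "assistant",
            "content": [{"type": "output_text",
                         "text": "".join(text_parts)}],
        })
    return items


# Drop the output-token cap fields the Codex backend currently rejects.
def strip_token_caps(params: dict) -> dict:
    for key in ("max_tokens", "max_output_tokens", "max_completion_tokens"):
        params.pop(key, None)
    return params
```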

French Example

import dspy
import dspy_codex_auth

dspy_codex_auth.install()

lm = dspy.LM("codex/gpt-5.5", cache=False)
dspy.configure(lm=lm, adapter=dspy.JSONAdapter())


class TranslateFrenchToEnglish(dspy.Signature):
    """Translate the French input into short, natural English."""

    french: str = dspy.InputField(desc="French sentence")
    english: str = dspy.OutputField(desc="Natural English translation")


translator = dspy.Predict(TranslateFrenchToEnglish)
print(translator(french="merci beaucoup").english)

Math Example With Reasoning Summary

import dspy
import dspy_codex_auth

dspy_codex_auth.install()

lm = dspy.LM(
    "codex/gpt-5.5",
    cache=False,
    reasoning_effort="medium",
    reasoning_summary="detailed",
)
dspy.configure(lm=lm, adapter=dspy.JSONAdapter())


class SolveMath(dspy.Signature):
    """Solve the math problem. Return a concise numeric answer and a brief explanation."""

    problem: str = dspy.InputField(desc="Math problem")
    answer: str = dspy.OutputField(desc="Concise final answer")
    explanation: str = dspy.OutputField(desc="Brief explanation")


solver = dspy.Predict(SolveMath)
pred = solver(
    problem=(
        "Compute the integral of the standard normal probability density "
        "function from 0 to 1.5."
    )
)

print(pred.answer)
print(pred.explanation)
print(lm.history[-1]["outputs"][0].get("reasoning_content"))

Attribution

dspy-codex-auth includes and adapts MIT-licensed auth and DSPy integration code from dspy-lm-auth:

https://github.com/MaximeRivest/dspy-lm-auth

The streamed-output reconstruction addresses a DSPy/Codex Responses streaming compatibility issue that was also discussed in dspy-lm-auth PR #2:

https://github.com/MaximeRivest/dspy-lm-auth/pull/2

dspy-lm-auth is MIT-licensed. The original copyright notice is preserved in THIRD_PARTY_NOTICES.md.

Development

uv sync --dev
uv run pytest
uv run ruff check .
uv build --no-sources

Release

Full PyPI update instructions are in RELEASING.md.

Short local release flow:

uv version --bump patch
rm -rf dist
uv build --no-sources
uv run --with twine python -m twine upload dist/*

PyPI releases are immutable, so every update needs a new version number.
