
Agnostic LLM abstraction layer (OpenAI, Gemini, Anthropic) for Codex

Project description

codex-ai


Agnostic LLM abstraction layer for the Codex ecosystem. It gives you one async prompt pipeline for OpenAI, Gemini, Anthropic, and OpenRouter, with routing and provider failover built in.


Install

# Core only
pip install codex-ai

# Provider extras
pip install "codex-ai[openai]"
pip install "codex-ai[gemini]"
pip install "codex-ai[anthropic]"
pip install "codex-ai[openai,gemini,anthropic]"

Requires Python 3.12 or newer.

Quick Start

Define a prompt once and dispatch it through any provider:

from codex_ai import LLMDispatcher, LLMMessage, LLMRouter, PromptResult, OpenAIProvider, GeminiProvider
from codex_ai.providers import MultiLLMProvider

router = LLMRouter()

@router.prompt("chat")
async def build_chat(text: str, **kw) -> PromptResult:
    return PromptResult(
        messages=[LLMMessage(role="user", content=text)],
        system="You are a helpful assistant.",
    )

# Multiple providers with automatic failover: OpenAI → Gemini
provider = MultiLLMProvider(
    providers={
        "openai": OpenAIProvider(api_key="sk-..."),
        "gemini": GeminiProvider(api_key="AIza..."),
    },
    default="openai",
    failover_list=["gemini"],  # try Gemini if OpenAI fails
)

dispatcher = LLMDispatcher(provider=provider)
dispatcher.include_router(router)

# Uses the default provider (OpenAI) and falls back to Gemini on error.
# Note: these awaits must run inside an async context (e.g. via asyncio.run).
response = await dispatcher.process("chat", text="Hello!")

# Or pick a provider explicitly:
response = await dispatcher.process("chat", text="Hello!", provider="gemini")
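The failover flow above can be sketched with plain asyncio. This is an illustration of the pattern only, not the codex-ai implementation: the helper name `call_with_failover` and the stub providers are invented for the example.

```python
import asyncio

async def call_with_failover(providers, order, prompt):
    """Try each provider in order; return the first successful response.

    `providers` maps names to async callables, and `order` mirrors
    [default, *failover_list] from the MultiLLMProvider configuration.
    """
    errors = {}
    for name in order:
        try:
            return await providers[name](prompt)
        except Exception as exc:  # a real implementation would narrow this
            errors[name] = exc
    raise RuntimeError(f"all providers failed: {errors}")

# Stub providers: "openai" always errors, "gemini" answers.
async def flaky_openai(prompt):
    raise ConnectionError("rate limited")

async def gemini_stub(prompt):
    return f"gemini says: {prompt}"

result = asyncio.run(call_with_failover(
    {"openai": flaky_openai, "gemini": gemini_stub},
    ["openai", "gemini"],
    "Hello!",
))
print(result)  # gemini says: Hello!
```

The key design point is that errors are collected rather than raised immediately, so the caller can see why every provider in the chain failed.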

Modules

Module                           Extra         Description
codex_ai.core                    -             Dispatcher, router, protocol types, sync wrapper, and shared exception contract
codex_ai.providers.openai        [openai]      OpenAI Chat Completions provider
codex_ai.providers.gemini        [gemini]      Google Gemini provider via google-genai
codex_ai.providers.anthropic_    [anthropic]   Anthropic Claude provider
codex_ai.providers.openrouter    [openai]      OpenRouter provider built on the OpenAI-compatible SDK
codex_ai.providers.multi         -             Multi-provider dispatcher with failover and model-based inference
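The multi-provider module mentions model-based inference, i.e. picking a provider from the model name. A rough sketch of that idea follows; the prefix table and the OpenRouter fallback are assumptions for illustration, not codex-ai's actual routing rules.

```python
# Hypothetical mapping from model-name prefix to the provider serving it.
MODEL_PREFIXES = {
    "gpt-": "openai",
    "gemini-": "gemini",
    "claude-": "anthropic",
}

def infer_provider(model: str) -> str:
    """Return the provider name inferred from a model identifier."""
    for prefix, provider in MODEL_PREFIXES.items():
        if model.startswith(prefix):
            return provider
    # Assumed fallback: route unknown models through OpenRouter.
    return "openrouter"

print(infer_provider("gpt-4o"))         # openai
print(infer_provider("claude-3-opus"))  # anthropic
```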

Development

# Install dev dependencies
uv sync --extra dev

# Tests, type checking, linters
uv run pytest
uv run mypy src/
uv run pre-commit run --all-files

# Build the distribution
uv build --no-sources

Documentation

Full docs with architecture, API reference, and data flow diagrams:

codexdlc.github.io/codex-ai

Part of the Codex ecosystem

Package          Role
codex-core       Foundation: immutable DTOs, PII masking, env settings
codex-platform   Infrastructure: Redis, Streams, ARQ workers, Notifications
codex-ai         LLM layer: unified async interface for OpenAI, Gemini, Anthropic
codex-services   Business logic: Booking engine, CRM, Calendar

Each library is fully standalone — install only what your project needs. Together they form the backbone of codex-bot (Telegram AI-agent infrastructure built on aiogram) and codex-django (Django integration layer).



Download files

Download the file for your platform.

Source Distribution

codex_ai-0.1.0.tar.gz (127.2 kB)


Built Distribution


codex_ai-0.1.0-py3-none-any.whl (19.6 kB)


File details

Details for the file codex_ai-0.1.0.tar.gz.

File metadata

  • Download URL: codex_ai-0.1.0.tar.gz
  • Upload date:
  • Size: 127.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for codex_ai-0.1.0.tar.gz
Algorithm Hash digest
SHA256 f67d83945c437fe58964210b0248b148df9bcd18aa456d59f0c0876f99aa0513
MD5 20c488992633428567bdb520844f94e9
BLAKE2b-256 33c58a2956b30eefe13300b02f3f4ead11c6393d9eb6dc261b09fe6c0823a721

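To verify a downloaded artifact against the SHA256 digest above, a small stdlib helper suffices. The helper is generic; only the filename and digest in the usage comment come from this release.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest, e.g.:
# sha256_of("codex_ai-0.1.0.tar.gz") == "f67d8394...99aa0513"
```

Streaming in 8 KB chunks keeps memory flat regardless of file size, which matters for larger sdists.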

Provenance

The following attestation bundles were made for codex_ai-0.1.0.tar.gz:

Publisher: publish.yml on CodexDLC/codex-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file codex_ai-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: codex_ai-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 19.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for codex_ai-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 049cffa50a2c0e6be3a2146af2e4344502ccf80f595e2501ab20067afaa2f799
MD5 ac30f91383c321a45dc93ac5197e5b21
BLAKE2b-256 a2af1791163bdba3244a74536bab7d3e80055e1ad5c4c9d1915c8c139f39dcbe


Provenance

The following attestation bundles were made for codex_ai-0.1.0-py3-none-any.whl:

Publisher: publish.yml on CodexDLC/codex-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
