
CodexMCP

Minimal FastMCP server that wraps the OpenAI Codex CLI and makes the model available over standard‑I/O so it can be consumed with mcp‑cli (or any other MCP‑compatible client).

Tools exposed by this server (all asynchronous):

  1. generate_code(description, language="Python", model="o4-mini")
  2. refactor_code(code, instruction, model="o4-mini")
  3. write_tests(code, description="", model="o4-mini")
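Each tool ultimately shells out to the codex binary and captures its output. The helper below is a generic sketch of that pattern, not the project's actual code; the name run_cli is illustrative, and the demo uses echo as a stand-in for the real codex invocation (whose flags are not documented here).

```python
import asyncio


async def run_cli(*args: str) -> tuple[str, str, int]:
    """Run a command asynchronously, capturing stdout, stderr, and exit code."""
    proc = await asyncio.create_subprocess_exec(
        *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    out, err = await proc.communicate()
    return out.decode(), err.decode(), proc.returncode


# Demo with a stand-in command; the real server would invoke `codex` here.
out, err, code = asyncio.run(run_cli("echo", "hello from the CLI"))
print(out.strip())
```

An async tool handler can await such a helper directly, so one slow Codex call does not block the server's other requests.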

Everything that the Codex subprocess prints (stdout and stderr) is recorded to ~/.codexmcp/logs/ with rotation (5 files × 5 MiB).
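That rotation policy maps directly onto the standard library's RotatingFileHandler. The snippet below is a sketch of the described setup, assuming a log file named codexmcp.log (the actual file name is not documented here):

```python
import logging
from logging.handlers import RotatingFileHandler
from pathlib import Path

LOG_DIR = Path.home() / ".codexmcp" / "logs"
LOG_DIR.mkdir(parents=True, exist_ok=True)

# 5 MiB per file; backupCount=4 keeps 4 rotated files plus the active one = 5 files.
handler = RotatingFileHandler(
    LOG_DIR / "codexmcp.log",
    maxBytes=5 * 1024 * 1024,
    backupCount=4,
)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

logger = logging.getLogger("codexmcp")
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)
logger.info("codex subprocess stdout/stderr is mirrored here")
```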


Installation (Linux/macOS)

Prerequisites

  1. Install Node 18 LTS and the Codex CLI globally:

     npm install -g @openai/codex

  2. Create a Python 3.10+ virtual environment and activate it:

     python3 -m venv .venv
     source .venv/bin/activate

  3. Create a .env file in your working directory:

     OPENAI_API_KEY=sk-<your-key>
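At startup the server reads OPENAI_API_KEY from this file (typically via a dotenv-style loader). For illustration, a minimal stdlib-only equivalent looks like this — load_dotenv here is an illustrative stand-in, not the project's implementation:

```python
import os
from pathlib import Path


def load_dotenv(path: str = ".env") -> None:
    """Minimal .env parser: one KEY=value per line, '#' comments ignored."""
    env_file = Path(path)
    if not env_file.exists():
        return
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            # Real environment variables take precedence over the file.
            os.environ.setdefault(key.strip(), value.strip())


load_dotenv()
print("OPENAI_API_KEY set:", "OPENAI_API_KEY" in os.environ)
```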

Install CodexMCP

Install directly from GitHub:

pip install "git+https://github.com/tomascupr/codexMCP.git#egg=codexmcp"

Or install from PyPI:

pip install codexmcp

Running the server

Once installed, you can start the server in one of two ways:

  • Using the console script:

    codexmcp
    
  • Using Python's module mode:

    python -m codexmcp.server
    

The first request may take a couple of seconds while the model warms up; after that each call returns in ~0.5‑1.5 s.


Using mcp-cli

# List available tools (smoke-test: should answer <2 s)
mcp-cli chat --server CodexMCP -q '["list_tools"]'

# Ask Codex to write a Rust hello-world program
mcp-cli chat --server CodexMCP -q \
    'mcp__CodexMCP__generate_code("hello world", "Rust")'

API examples

# Refactor some code
mcp-cli chat --server CodexMCP -q \
  'mcp__CodexMCP__refactor_code("print(1+1)", "convert to a function")'

# Generate PyTest tests
mcp-cli chat --server CodexMCP -q \
  'mcp__CodexMCP__write_tests("def fib(n): return 1 if n<2 else fib(n-1)+fib(n-2)")'

Troubleshooting

  • codex: command not found → ensure npm's global bin directory is on $PATH.
  • .env warnings → make sure your .env file is in your working directory.
  • Logs not written → check permissions for ~/.codexmcp.
  • Long delay before first answer → normal; the model container needs to warm up.
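For the first item, a quick way to check from Python whether the codex binary is reachable (the suggested PATH fix reflects npm's usual global-prefix layout; verify it for your setup):

```python
import shutil

# Diagnostic: is the `codex` binary on PATH for this environment?
codex_path = shutil.which("codex")
if codex_path is None:
    print("codex not found; add npm's global bin directory to PATH, e.g.")
    print('  export PATH="$(npm config get prefix)/bin:$PATH"')
else:
    print(f"codex found at {codex_path}")
```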


License

MIT‑0 – see LICENSE if present.

Download files

Source Distribution

  codexmcp-0.1.1.tar.gz (8.8 kB)

Built Distribution

  codexmcp-0.1.1-py3-none-any.whl (9.3 kB)

File details

Details for the file codexmcp-0.1.1.tar.gz.

  • Size: 8.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.7

Hashes for codexmcp-0.1.1.tar.gz

  Algorithm    Hash digest
  SHA256       19fb28f4f5492b7eb3efa29b18377029b32461bd7b73fdc23e221000e1b64ac5
  MD5          d6122b55cf08c4f3f92ed539a4c7761e
  BLAKE2b-256  7787b1d9c50b968477c3d174ce357965ef576d36a264b96918506e866fa5b9a7

File details

Details for the file codexmcp-0.1.1-py3-none-any.whl.

  • Size: 9.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.7

Hashes for codexmcp-0.1.1-py3-none-any.whl

  Algorithm    Hash digest
  SHA256       53d1d55490722b6fd25c1710e2ed631fc1b0a9a1f585710a2edb429d43cf35e9
  MD5          d7c5ad44b6f105ef26919ac6359142ba
  BLAKE2b-256  ff594ef7dd53e4d2dd1e05917eaefd0631be4898721a1f5c5b94c1daa32c1dfb
