CodexMCP

Minimal FastMCP server that wraps the OpenAI Codex CLI and makes the model available over standard I/O so it can be consumed with mcp-cli (or any other MCP-compatible client).
Tools exposed by this server (all asynchronous):

- generate_code(description, language="Python", model="o4-mini")
- refactor_code(code, instruction, model="o4-mini")
- write_tests(code, description="", model="o4-mini")
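Each tool ultimately shells out to the Codex CLI subprocess. A minimal sketch of how such an async wrapper might look — the `run_cli` helper is illustrative, not CodexMCP's actual implementation, and the Python interpreter stands in for the real `codex` binary:

```python
import asyncio
import sys

async def run_cli(*args: str) -> str:
    """Spawn a subprocess and capture its stdout asynchronously."""
    proc = await asyncio.create_subprocess_exec(
        *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    stdout, stderr = await proc.communicate()
    if proc.returncode != 0:
        raise RuntimeError(stderr.decode())
    return stdout.decode().strip()

# Demo: the Python interpreter stands in for the `codex` binary.
print(asyncio.run(run_cli(sys.executable, "-c", "print('hello')")))  # prints hello
```

Because the subprocess is awaited rather than blocked on, the server can keep handling other MCP requests while Codex is generating.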
Everything that the Codex subprocess prints (stdout and stderr) is recorded
to ~/.codexmcp/logs/ with rotation (5 files × 5 MiB).
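The rotation policy above can be reproduced with the standard library. A sketch, assuming the log directory from the text; the file name and exact handler configuration inside CodexMCP may differ:

```python
import logging
from logging.handlers import RotatingFileHandler
from pathlib import Path

log_dir = Path.home() / ".codexmcp" / "logs"
log_dir.mkdir(parents=True, exist_ok=True)

# Rotate at 5 MiB; the active file plus 4 backups gives 5 files total.
handler = RotatingFileHandler(
    log_dir / "codexmcp.log",
    maxBytes=5 * 1024 * 1024,
    backupCount=4,
)
logger = logging.getLogger("codexmcp")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("server started")
```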
Installation (Linux/macOS)
Prerequisites
- Install Node 18 LTS and the Codex CLI globally:
npm install -g @openai/codex
- Create a Python 3.10+ virtual environment and activate it:
python3 -m venv .venv
source .venv/bin/activate
- Create a .env file in your working directory:
OPENAI_API_KEY=sk-<your-key>
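The README does not say which loader CodexMCP uses to read this file, but a .env file of this shape can be parsed with a few lines of standard-library Python (libraries such as python-dotenv do the same more robustly); the `load_dotenv` helper here is a hypothetical sketch:

```python
import os
from pathlib import Path

def load_dotenv(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines from a .env file into os.environ."""
    loaded = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        loaded[key.strip()] = value.strip()
        os.environ.setdefault(key.strip(), value.strip())
    return loaded
```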
Install CodexMCP
Install directly from GitHub:
pip install "git+https://github.com/tomascupr/codexMCP.git#egg=codexmcp"
Or from PyPI:
pip install codexmcp
Running the server
Once installed, you can start the server in one of two ways:
- Using the console script:

  codexmcp

- Using Python's module mode:

  python -m codexmcp.server
The first request may take a couple of seconds while the model warms up; after that each call returns in ~0.5‑1.5 s.
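Under the hood, both entry points speak MCP over standard I/O: the client writes newline-delimited JSON-RPC 2.0 requests to the server's stdin and reads responses from its stdout. A sketch of what a `tools/list` request looks like on the wire (message shape per the MCP specification; this builds the message only and does not start the server):

```python
import json

# A JSON-RPC 2.0 request asking an MCP server to enumerate its tools.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}
wire = json.dumps(request) + "\n"  # stdio transport is newline-delimited
print(wire, end="")
```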
Using mcp-cli
# List available tools (smoke-test: should answer <2 s)
mcp-cli chat --server CodexMCP -q '["list_tools"]'
# Ask Codex to write a Rust hello-world program
mcp-cli chat --server CodexMCP -q \
'mcp__CodexMCP__generate_code("hello world", "Rust")'
API examples
# Refactor some code
mcp-cli chat --server CodexMCP -q \
'mcp__CodexMCP__refactor_code("print(1+1)", "convert to a function")'
# Generate PyTest tests
mcp-cli chat --server CodexMCP -q \
'mcp__CodexMCP__write_tests("def fib(n): return 1 if n<2 else fib(n-1)+fib(n-2)")'
Troubleshooting
- codex: command not found → ensure npm's global bin directory is on $PATH.
- .env warnings → make sure your .env file is in your working directory.
- Logs not written → check permissions for ~/.codexmcp.
- Long delay before the first answer → normal; the model container has to warm up.
License
MIT‑0 – see LICENSE if present.
Download files
File details
Details for the file codexmcp-0.1.1.tar.gz.
File metadata
- Download URL: codexmcp-0.1.1.tar.gz
- Upload date:
- Size: 8.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 19fb28f4f5492b7eb3efa29b18377029b32461bd7b73fdc23e221000e1b64ac5 |
| MD5 | d6122b55cf08c4f3f92ed539a4c7761e |
| BLAKE2b-256 | 7787b1d9c50b968477c3d174ce357965ef576d36a264b96918506e866fa5b9a7 |
File details
Details for the file codexmcp-0.1.1-py3-none-any.whl.
File metadata
- Download URL: codexmcp-0.1.1-py3-none-any.whl
- Upload date:
- Size: 9.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 53d1d55490722b6fd25c1710e2ed631fc1b0a9a1f585710a2edb429d43cf35e9 |
| MD5 | d7c5ad44b6f105ef26919ac6359142ba |
| BLAKE2b-256 | ff594ef7dd53e4d2dd1e05917eaefd0631be4898721a1f5c5b94c1daa32c1dfb |