# CodexMCP

Minimal FastMCP server that wraps the OpenAI Codex CLI.

## What is CodexMCP?
CodexMCP is a service that gives your applications access to AI coding capabilities without needing to build complex integrations. It's a server that exposes powerful code-related AI tools through a simple, standardized API.
**Important:** CodexMCP is not an autonomous agent; it is a tool provider that responds to specific requests. Your application remains in control, making specific requests for code generation, refactoring, or documentation as needed.
Think of CodexMCP as a bridge between your application and OpenAI's powerful AI coding capabilities. You send structured requests to the server (like "generate Python code that sorts a list"), and it returns the requested code or documentation.
A minimal FastMCP server wrapping the OpenAI Codex CLI to provide AI code generation, refactoring, and documentation capabilities through a standardized API.
## Installation

1. **Prerequisites:**
   - Node.js 18 LTS or later
   - Python 3.10 or later
   - Codex CLI installed globally:

     ```bash
     npm install -g @openai/codex
     ```

2. **Install CodexMCP:**

   ```bash
   pip install codexmcp
   ```

   For development, install with test dependencies:

   ```bash
   pip install -e .[test]
   ```

3. **Environment setup:**
   - Create a `.env` file in your project root
   - Add your OpenAI API key:

     ```
     OPENAI_API_KEY=sk-your-key-here
     ```

   - Optional environment variables:
     - `CODEXMCP_DEFAULT_MODEL`: Default model to use (default: `"o4-mini"`)
     - `CODEXMCP_LOG_LEVEL`: Logging level (default: `INFO`)
     - `CODEXMCP_CONSOLE_LOG`: Enable console logging (default: `true`)
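The optional variables above can be resolved with standard library tools. The sketch below is illustrative only (the simple `KEY=value` parser and `get_setting` helper are assumptions, not CodexMCP's actual configuration loader):

```python
import os

# Defaults mirroring the documented optional variables.
DEFAULTS = {
    "CODEXMCP_DEFAULT_MODEL": "o4-mini",
    "CODEXMCP_LOG_LEVEL": "INFO",
    "CODEXMCP_CONSOLE_LOG": "true",
}


def load_env(path=".env"):
    """Parse simple KEY=value lines from a .env file into os.environ."""
    if os.path.exists(path):
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    # Existing environment variables take precedence.
                    os.environ.setdefault(key.strip(), value.strip())


def get_setting(name):
    """Return the environment value, falling back to the documented default."""
    return os.environ.get(name, DEFAULTS.get(name))
```

A real deployment would more likely use a library such as `python-dotenv`; the point is simply that the environment always wins over the defaults.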
## Usage

### Running the Server

Start the CodexMCP server with one simple command:

```bash
python -m codexmcp.server
```

or use the convenient entry point:

```bash
codexmcp
```

The server starts listening on port 8080 by default. Your applications can now make requests to the server's API endpoints.
### How It Works

1. **Your application** makes a request to a specific CodexMCP endpoint (such as `/tools/generate_code`)
2. **CodexMCP server** processes the request and sends it to the OpenAI model
3. **OpenAI model** generates the requested code or documentation
4. **CodexMCP server** returns the result to your application

This approach gives you the power of AI coding assistance while keeping your application in control of when and how to use it.
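From the client side, the flow above amounts to building a structured tool request. The sketch below is a hypothetical request builder: the endpoint path and JSON body shape are assumptions based on the tool names in this document, not a confirmed wire format.

```python
import json

# Hypothetical: the "/tools/<name>" path and JSON keys mirror the documented
# tool parameters; they are not a confirmed CodexMCP wire format.
def build_tool_request(tool, base_url="http://localhost:8080", **params):
    """Assemble the URL and JSON body for one tool invocation."""
    return {
        "url": f"{base_url}/tools/{tool}",
        "body": json.dumps(params),
    }


request = build_tool_request(
    "generate_code",
    description="Create a function to calculate Fibonacci numbers",
    language="Python",
)
```

In practice you would hand this kind of request to an MCP client library rather than constructing it by hand, as the example client below does.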
## Available Tools

CodexMCP provides the following AI-powered tools:

- **`generate_code`**: Generate code in any programming language
  - `description`: Task description
  - `language`: Programming language (default: `"Python"`)
- **`refactor_code`**: Improve existing code
  - `code`: Source code to refactor
  - `instruction`: How to refactor the code
- **`write_tests`**: Generate unit tests for code
  - `code`: Source code to test
  - `description`: Additional testing requirements
- **`explain_code`**: Explain code functionality and structure
  - `code`: Source code to explain
  - `detail_level`: Level of detail (`"brief"`, `"medium"`, `"detailed"`)
- **`generate_docs`**: Create documentation for code
  - `code`: Source code to document
  - `doc_format`: Output format (`"docstring"`, `"markdown"`, `"html"`)
- **`write_openai_agent`**: Generate an OpenAI Agent implementation
  - `name`: Agent name
  - `instructions`: Agent system prompt
  - `tool_functions`: List of tool descriptions
  - `description`: Additional agent details
- **`generate_api_docs`**: Generate API documentation or client code
  - `code`: API implementation code
  - `framework`: Web framework used (default: `"FastAPI"`)
  - `output_format`: Output format (`"openapi"`, `"swagger"`, `"markdown"`, `"code"`)
  - `client_language`: Language for client code (when `output_format` is `"code"`)
## Example Client

```python
import asyncio

from fastmcp import MCPClient


async def main():
    client = MCPClient("http://localhost:8080")

    # Generate some Python code
    code = await client.generate_code(
        description="Create a function to calculate Fibonacci numbers",
        language="Python",
    )
    print(code)

    # Generate API documentation
    api_code = """
from fastapi import FastAPI, Query

app = FastAPI()

@app.get("/items/")
async def read_items(q: str = Query(None, min_length=3, max_length=50)):
    results = {"items": [{"item_id": "Foo"}, {"item_id": "Bar"}]}
    if q:
        results["items"] = [item for item in results["items"] if q in item["item_id"]]
    return results
"""
    docs = await client.generate_api_docs(
        code=api_code,
        framework="FastAPI",
        output_format="openapi",
    )
    print(docs)


if __name__ == "__main__":
    asyncio.run(main())
```
## Testing

Run tests with pytest:

```bash
# Run all tests
pytest

# Run a specific test
pytest tests/test_tools.py::TestGenerateCode

# Test with coverage
pytest --cov=codexmcp
```
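Because every tool ultimately awaits a model call, tests of client-side code can stub the client rather than hit a live server. A minimal sketch, assuming only that `generate_code` is an async method as shown in the example client (the stub below is not part of CodexMCP's own test suite):

```python
import asyncio
from unittest.mock import AsyncMock

# Hypothetical test: the MCP client is replaced by an AsyncMock, so no
# running server or API key is needed.
def test_generate_code_returns_text():
    client = AsyncMock()
    client.generate_code.return_value = "def fib(n): ..."

    result = asyncio.run(
        client.generate_code(description="fibonacci", language="Python")
    )

    assert isinstance(result, str)
    client.generate_code.assert_awaited_once()
```

This keeps the test suite fast and deterministic; integration tests against a real server can then be run separately.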
## License

MIT