
CodexMCP

What is CodexMCP?

CodexMCP is a service that gives your applications access to AI coding capabilities without needing to build complex integrations. It's a server that exposes powerful code-related AI tools through a simple, standardized API.

Important: CodexMCP is not an autonomous agent; it is a tool provider that responds to specific requests. Your application stays in control, issuing explicit requests for code generation, refactoring, or documentation as needed.

Think of CodexMCP as a bridge between your application and OpenAI's powerful AI coding capabilities. You send structured requests to the server (like "generate Python code that sorts a list"), and it returns the requested code or documentation.

A minimal FastMCP server wrapping the OpenAI Codex CLI to provide AI code generation, refactoring, and documentation capabilities through a standardized API.

Installation

  1. Prerequisites:

    • Node.js 18 LTS or later
    • Python 3.10 or later
    • Codex CLI installed globally:
      npm install -g @openai/codex
      
  2. Install CodexMCP:

    pip install codexmcp
    

    For development, install with test dependencies:

    pip install -e ".[test]"
    
  3. Environment Setup:

    • Create a .env file in your project root
    • Add your OpenAI API key:
      OPENAI_API_KEY=sk-your-key-here
      
    • Optional environment variables:
      • CODEXMCP_DEFAULT_MODEL: Default model to use (default: "o4-mini")
      • CODEXMCP_LOG_LEVEL: Logging level (default: INFO)
      • CODEXMCP_CONSOLE_LOG: Enable console logging (default: true)
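
    Putting these together, a complete .env might look like the sketch below. The API key value is a placeholder, and the optional variables are shown with their documented defaults, so you only need to include the ones you want to change:

    ```
    OPENAI_API_KEY=sk-your-key-here
    CODEXMCP_DEFAULT_MODEL=o4-mini
    CODEXMCP_LOG_LEVEL=INFO
    CODEXMCP_CONSOLE_LOG=true
    ```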

Usage

Running the Server

Start the CodexMCP server with one simple command:

python -m codexmcp.server

or use the convenient entry point:

codexmcp

The server listens on port 8080 by default. Your applications can now make requests to its API endpoints.

How It Works

  1. Your Application makes a request to a specific CodexMCP endpoint (like /tools/generate_code)
  2. CodexMCP Server processes the request and sends it to the OpenAI model
  3. OpenAI Model generates the requested code or documentation
  4. CodexMCP Server returns the result to your application

This approach gives you the power of AI coding assistance while keeping your application in control of when and how to use it.
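
As an illustration of step 1, a request to the /tools/generate_code endpoint carries the tool's parameters as structured data. The JSON below is a sketch of those parameters only; the exact wire format depends on the MCP transport, so it is not the literal protocol:

```json
{
  "description": "Create a function to calculate Fibonacci numbers",
  "language": "Python"
}
```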

Available Tools

CodexMCP provides the following AI-powered tools:

  1. generate_code: Generate code in any programming language

    • description: Task description
    • language: Programming language (default: "Python")
  2. refactor_code: Improve existing code

    • code: Source code to refactor
    • instruction: How to refactor the code
  3. write_tests: Generate unit tests for code

    • code: Source code to test
    • description: Additional testing requirements
  4. explain_code: Explain code functionality and structure

    • code: Source code to explain
    • detail_level: Level of detail ("brief", "medium", "detailed")
  5. generate_docs: Create documentation for code

    • code: Source code to document
    • doc_format: Output format ("docstring", "markdown", "html")
  6. write_openai_agent: Generate an OpenAI Agent implementation

    • name: Agent name
    • instructions: Agent system prompt
    • tool_functions: List of tool descriptions
    • description: Additional agent details
  7. generate_api_docs: Generate API documentation or client code

    • code: API implementation code
    • framework: Web framework used (default: "FastAPI")
    • output_format: Output format ("openapi", "swagger", "markdown", "code")
    • client_language: Language for client code (when output_format is "code")
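
The parameter lists above can be summarized as a small lookup table. The sketch below is illustrative only (neither the table nor the check_request helper is part of the codexmcp package), and it treats parameters with documented defaults or "additional" wording as optional, which is an assumption:

```python
# Required and optional parameters for each CodexMCP tool, per the list above.
# Illustrative only: this table and helper are not part of the codexmcp package.
TOOLS = {
    "generate_code":      {"required": ["description"], "optional": ["language"]},
    "refactor_code":      {"required": ["code", "instruction"], "optional": []},
    "write_tests":        {"required": ["code"], "optional": ["description"]},
    "explain_code":       {"required": ["code"], "optional": ["detail_level"]},
    "generate_docs":      {"required": ["code"], "optional": ["doc_format"]},
    "write_openai_agent": {"required": ["name", "instructions", "tool_functions"],
                           "optional": ["description"]},
    "generate_api_docs":  {"required": ["code"],
                           "optional": ["framework", "output_format", "client_language"]},
}

def check_request(tool: str, params: dict) -> list[str]:
    """Return the names of required parameters missing from a request."""
    spec = TOOLS[tool]
    return [name for name in spec["required"] if name not in params]
```

A caller can run check_request before sending anything to the server and fail fast on an incomplete request instead of waiting for a round trip.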

Example Client

import asyncio
from fastmcp import MCPClient

async def main():
    client = MCPClient("http://localhost:8080")
    
    # Generate some Python code
    code = await client.generate_code(
        description="Create a function to calculate Fibonacci numbers",
        language="Python"
    )
    print(code)
    
    # Generate API documentation
    api_code = """
    from fastapi import FastAPI, Query
    
    app = FastAPI()
    
    @app.get("/items/")
    async def read_items(q: str = Query(None, min_length=3, max_length=50)):
        results = {"items": [{"item_id": "Foo"}, {"item_id": "Bar"}]}
        if q:
            results["items"] = [item for item in results["items"] if q in item["item_id"]]
        return results
    """
    
    docs = await client.generate_api_docs(
        code=api_code,
        framework="FastAPI",
        output_format="openapi"
    )
    print(docs)

if __name__ == "__main__":
    asyncio.run(main())

Testing

Run tests with pytest:

# Run all tests
pytest

# Run a specific test
pytest tests/test_tools.py::TestGenerateCode

# Test with coverage
pytest --cov=codexmcp

License

MIT

