
Caitlyn OpenAPI MCP Server

MCP server that exposes OpenAPI specifications as queryable documentation resources for LLMs, with Scalar deep links.

Features

  • URL-based OpenAPI spec loading: Load specs from any URL, not just local files
  • $ref resolution: Automatically resolves all $ref references (including remote refs) using Prance
  • Scalar deep links: Every endpoint, schema, and security scheme includes a docs_url pointing to Scalar documentation
  • MCP resources: Expose spec structure for introspection
  • MCP tools: Search and query endpoints, schemas, and security schemes
  • Streamable HTTP: Built for Bedrock AgentCore integration
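The `$ref` resolution above is delegated to Prance; conceptually it behaves like this minimal pure-Python sketch (illustration only: local `#/...` pointers, no remote refs, no cycle handling):

```python
def resolve_refs(node, root):
    """Recursively inline local '#/...' JSON pointers (illustration only)."""
    if isinstance(node, dict):
        ref = node.get("$ref")
        if isinstance(ref, str) and ref.startswith("#/"):
            target = root
            for part in ref[2:].split("/"):
                # JSON-pointer escapes: ~1 means '/', ~0 means '~'
                target = target[part.replace("~1", "/").replace("~0", "~")]
            return resolve_refs(target, root)
        return {k: resolve_refs(v, root) for k, v in node.items()}
    if isinstance(node, list):
        return [resolve_refs(item, root) for item in node]
    return node

spec = {
    "components": {"schemas": {"User": {"type": "object"}}},
    "paths": {"/users": {"get": {"responses": {"200": {
        "content": {"application/json": {"schema": {"$ref": "#/components/schemas/User"}}}}}}}},
}
resolved = resolve_refs(spec, spec)
```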

Installation

Using pip (from PyPI)

Install from PyPI:

pip install caitlyn-openapi-mcp

Using uvx (recommended)

For isolated execution without global installation:

uvx caitlyn-openapi-mcp

From source

git clone https://github.com/caitlyn-ai/caitlyn-openapi-mcp.git
cd caitlyn-openapi-mcp
pip install -e .

Development installation

git clone https://github.com/caitlyn-ai/caitlyn-openapi-mcp.git
cd caitlyn-openapi-mcp
pip install -e ".[dev]"

Configuration

The server is configured via environment variables:

Required

  • OPENAPI_SPEC_URL: Full URL to the OpenAPI JSON/YAML specification
    • Example: https://api.example.com/openapi.json
    • Example: https://raw.githubusercontent.com/org/repo/main/openapi.yaml

Optional

  • DOCS_RENDERER: Documentation renderer type (default: "scalar")
    • Currently only "scalar" is supported
  • DOCS_BASE_URL: Base URL of the Scalar documentation UI
    • Example: https://api.example.com/docs
    • Example: https://api.example.com/scalar
    • If not provided, docs_url fields will be null
  • MCP_TRANSPORT: Transport mode (default: "stdio")
    • "stdio": For local development and Claude Desktop (default)
    • "streamable-http": For AWS Bedrock AgentCore deployment

MCP Client Setup

This server can be used with any MCP-compatible client. Below are configuration examples for popular clients.

Claude Desktop (stdio transport)

Claude Desktop uses the stdio transport by default, so there is no need to set MCP_TRANSPORT.

Add this configuration to your Claude Desktop config file:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "openapi-docs": {
      "command": "python",
      "args": ["-m", "openapi_mcp.server"],
      "env": {
        "OPENAPI_SPEC_URL": "https://api.example.com/openapi.json",
        "DOCS_BASE_URL": "https://api.example.com/docs"
      }
    }
  }
}

Using uvx (recommended for isolated environments):

{
  "mcpServers": {
    "openapi-docs": {
      "command": "uvx",
      "args": ["caitlyn-openapi-mcp"],
      "env": {
        "OPENAPI_SPEC_URL": "https://api.example.com/openapi.json",
        "DOCS_BASE_URL": "https://api.example.com/docs"
      }
    }
  }
}

The server automatically uses stdio transport for Claude Desktop communication.

Cline (VS Code Extension)

Add to your MCP settings in VS Code:

File: .vscode/mcp.json or global settings

{
  "mcpServers": {
    "openapi-docs": {
      "command": "python",
      "args": ["-m", "openapi_mcp.server"],
      "env": {
        "OPENAPI_SPEC_URL": "https://api.example.com/openapi.json",
        "DOCS_BASE_URL": "https://api.example.com/docs"
      }
    }
  }
}

AWS Bedrock AgentCore (streamable-http transport)

For Bedrock AgentCore deployment, set MCP_TRANSPORT=streamable-http:

Docker Deployment

# Dockerfile with Bedrock configuration
FROM your-base-image

ENV OPENAPI_SPEC_URL=https://api.example.com/openapi.json
ENV DOCS_BASE_URL=https://api.example.com/docs
ENV MCP_TRANSPORT=streamable-http

CMD ["caitlyn-openapi-mcp"]

docker-compose.yml

version: '3.8'

services:
  openapi-mcp:
    build: .
    ports:
      - "8000:8000"
    environment:
      - OPENAPI_SPEC_URL=https://api.example.com/openapi.json
      - DOCS_BASE_URL=https://api.example.com/docs
      - MCP_TRANSPORT=streamable-http

Programmatic Usage

import os
from openapi_mcp.server import create_server

# Set transport mode for Bedrock
os.environ["MCP_TRANSPORT"] = "streamable-http"
os.environ["OPENAPI_SPEC_URL"] = "https://api.example.com/openapi.json"
os.environ["DOCS_BASE_URL"] = "https://api.example.com/docs"

mcp = create_server()
mcp.run()  # Uses streamable-http from env

Deploy as a containerized service and configure your Bedrock agent to connect to the HTTP endpoint.

Generic MCP Client Configuration

For any MCP client that supports command-based servers:

{
  "command": "/path/to/python",
  "args": ["-m", "openapi_mcp.server"],
  "env": {
    "OPENAPI_SPEC_URL": "https://your-api.com/openapi.json",
    "DOCS_RENDERER": "scalar",
    "DOCS_BASE_URL": "https://your-api.com/docs"
  }
}

Example: Caitlyn API

Configure for the Caitlyn API with Scalar documentation:

{
  "mcpServers": {
    "caitlyn-api": {
      "command": "uvx",
      "args": ["caitlyn-openapi-mcp"],
      "env": {
        "OPENAPI_SPEC_URL": "https://betty.getcaitlyn.ai/docs/openapi-v1.json",
        "DOCS_BASE_URL": "https://betty.getcaitlyn.ai/api/docs"
      }
    }
  }
}

Multiple API Configurations

You can configure multiple OpenAPI specs by running separate server instances:

{
  "mcpServers": {
    "api-production": {
      "command": "uvx",
      "args": ["caitlyn-openapi-mcp"],
      "env": {
        "OPENAPI_SPEC_URL": "https://api.prod.example.com/openapi.json",
        "DOCS_BASE_URL": "https://docs.prod.example.com"
      }
    },
    "api-staging": {
      "command": "uvx",
      "args": ["caitlyn-openapi-mcp"],
      "env": {
        "OPENAPI_SPEC_URL": "https://api.staging.example.com/openapi.json",
        "DOCS_BASE_URL": "https://docs.staging.example.com"
      }
    }
  }
}

Troubleshooting Client Setup

Issue: Server fails to start in Claude Desktop

Solutions:

  • Verify Python is in your PATH: which python (macOS/Linux) or where python (Windows)
  • Use full path to Python: "/usr/local/bin/python3.11"
  • Check Claude Desktop logs: ~/Library/Logs/Claude/mcp*.log (macOS)
  • Ensure OPENAPI_SPEC_URL is accessible from your machine

Issue: "ModuleNotFoundError: No module named 'openapi_mcp'"

Solutions:

  • Install the package: pip install caitlyn-openapi-mcp
  • Or use uvx for automatic installation: uvx caitlyn-openapi-mcp
  • Verify installation: python -m openapi_mcp.server --help

Issue: Tools not appearing in Claude Desktop

Solutions:

  • Restart Claude Desktop completely
  • Check the server is running: Look for the process in Activity Monitor/Task Manager
  • Verify configuration JSON is valid (use a JSON validator)
  • Check server logs for errors

Usage

Running the server

# Set required environment variables
export OPENAPI_SPEC_URL="https://api.example.com/openapi.json"
export DOCS_BASE_URL="https://api.example.com/docs"

# Run the server
caitlyn-openapi-mcp

Using Docker

docker build -t caitlyn-openapi-mcp .

docker run -p 8000:8000 \
  -e OPENAPI_SPEC_URL="https://api.example.com/openapi.json" \
  -e DOCS_BASE_URL="https://api.example.com/docs" \
  caitlyn-openapi-mcp

MCP Resources

The server exposes one static resource:

api-specification

The complete OpenAPI 3.x specification in JSON format, fully resolved with all $refs expanded. It can be fed to OpenAPI validation tools and code generators, or kept for reference.

MCP Tools

The server provides tools designed to help LLMs answer user questions about the API. Each tool includes contextual descriptions to guide when it should be used.

list_api_endpoints

Use for: Getting an overview of what the API can do, or finding endpoints by category.

Parameters:

  • tag (optional): Filter by API category/tag (e.g., "users", "posts", "auth")
  • search (optional): Search term to find endpoints (searches paths, descriptions, summaries)

Returns: List of endpoints with path, method, summary, description, tags, and docs_url

Example use cases:

  • User asks: "What can this API do?"
  • User asks: "Show me all user-related endpoints"

get_endpoint_details

Use for: Getting detailed information about a specific endpoint including parameters, request body, and responses.

Parameters:

  • method: HTTP method (GET, POST, PUT, DELETE, PATCH, etc.)
  • path: API path (e.g., "/api/v1/users" or "/users/{userId}")

Returns: Complete endpoint details including parameters, request body schema, response schemas, and docs_url

Example use cases:

  • User asks: "How do I call the create user endpoint?"
  • User asks: "What parameters does the GET /users endpoint need?"
  • User asks: "What's the request body for creating a post?"

get_schema_definition

Use for: Understanding the structure of request/response data models.

Parameters:

  • schema_name: Name of the schema (e.g., "User", "CreateUserRequest", "PaginatedResponse")

Returns: Schema definition with properties, types, required fields, and docs_url

Example use cases:

  • User asks: "What fields does a User object have?"
  • User asks: "What's the structure of the CreatePostRequest?"
  • User asks: "What does the response look like?"

search_api_endpoints

Use for: Finding endpoints by functionality when you don't know the exact path.

Parameters:

  • query: What the user wants to do (e.g., "create knowledge base", "upload file", "get user profile")
  • max_results (optional, default: 20): Maximum number of results to return

Returns: Matching endpoints with path, method, summary, description, tags, and docs_url

Example use cases:

  • User asks: "How do I create a knowledge base through the API?"
  • User asks: "Can I upload files?"
  • User asks: "Is there an endpoint for user authentication?"

list_api_tags

Use for: Understanding how the API is organized into functional categories.

Parameters: None

Returns: List of tags/categories with endpoint counts

Example use cases:

  • User asks: "What functional areas does this API cover?"
  • User asks: "How is this API organized?"

Scalar Deep Links

When DOCS_BASE_URL is configured, the server generates deep links to Scalar documentation:

Endpoint links

Format: {base_url}#tag/{tag}/{method}/{path}

Example: https://api.example.com/docs#tag/users/get/api/v1/users

  • tag: The first tag on the operation (defaults to "default" if no tags)
  • method: HTTP method in lowercase (get, post, etc.)
  • path: OpenAPI path with leading slash stripped
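The rules above can be expressed as a small helper (`endpoint_docs_url` is illustrative, not the package's actual function):

```python
def endpoint_docs_url(base_url, method, path, tags=None):
    """Build a Scalar deep link: {base_url}#tag/{tag}/{method}/{path} (sketch)."""
    tag = (tags or ["default"])[0]         # first tag, or "default" when untagged
    return f"{base_url}#tag/{tag}/{method.lower()}/{path.lstrip('/')}"
```

For example, `endpoint_docs_url("https://api.example.com/docs", "GET", "/api/v1/users", ["users"])` reproduces the link shown above.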

Schema links

Format: {base_url}#schema/{schemaName}

Example: https://api.example.com/docs#schema/User

Security scheme links

Format: {base_url}#security/{schemeName}

Example: https://api.example.com/docs#security/bearerAuth

Development

Running tests

pytest

Code formatting

black src tests
ruff check src tests

Type checking

pyright

Running with coverage

pytest --cov=src --cov-report=html

Architecture

The server is built with the following components:

  • config.py: Environment-based configuration
  • model.py: Data models for endpoints, schemas, and the OpenAPI index
  • openapi_loader.py: URL-based OpenAPI spec loading using Prance and openapi-core
  • docs_links.py: Documentation deep link generation (currently Scalar only)
  • resources.py: MCP resource definitions
  • tools.py: MCP tool definitions
  • server.py: Main server wiring and entry point

License

[Your license here]

Contributing

[Your contributing guidelines here]
