
Dockerized MCP server for OpenAI models

Project description

openai_mcp_server MCP server


Components

Resources

The server implements a simple note storage system with:

  • Custom note:// URI scheme for accessing individual notes
  • Each note resource has a name, a description, and a `text/plain` MIME type
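As a rough illustration, the note store behind the `note://` scheme could look like the sketch below. This is an assumption about the design, not the package's actual source; the `internal` host segment and the function names are illustrative:

```python
from dataclasses import dataclass


@dataclass
class Note:
    """One stored note, exposed as a text/plain resource."""
    name: str
    content: str
    mime_type: str = "text/plain"


# In-memory server state: note name -> Note
notes: dict[str, Note] = {}


def note_uri(name: str) -> str:
    """Build the custom note:// URI for a stored note."""
    return f"note://internal/{name}"


def read_note(uri: str) -> str:
    """Resolve a note:// URI back to its text/plain content."""
    prefix = "note://internal/"
    if not uri.startswith(prefix):
        raise ValueError(f"unsupported URI: {uri}")
    return notes[uri[len(prefix):]].content
```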

Prompts

The server provides a single prompt:

  • summarize-notes: Creates summaries of all stored notes
    • Optional "style" argument to control detail level (brief/detailed)
    • Generates prompt combining all current notes with style preference
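A minimal sketch of how such a prompt might be assembled; the function name and wording are assumptions for illustration, not the package's actual code:

```python
def build_summarize_prompt(notes: dict[str, str], style: str = "brief") -> str:
    """Combine every stored note into one summarization prompt.

    style: "brief" (default) or "detailed", controlling detail level.
    """
    instruction = (
        "Give a detailed summary of each note."
        if style == "detailed"
        else "Keep the summary brief."
    )
    listing = "\n".join(f"- {name}: {content}" for name, content in notes.items())
    return f"Summarize the current notes. {instruction}\n{listing}"
```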

Tools

The server implements one tool:

  • add-note: Adds a new note to the server
    • Takes "name" and "content" as required string arguments
    • Updates server state and notifies clients of resource changes
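In outline, the tool handler would validate its arguments, update the in-memory state, and report which resources changed. The following is a sketch under assumed names, not the real implementation:

```python
def add_note(state: dict[str, str], name: str, content: str) -> list[str]:
    """Store a note and return the note:// URIs clients should re-list.

    Mirrors the tool contract: "name" and "content" are required strings.
    """
    if not name or not content:
        raise ValueError('both "name" and "content" are required')
    state[name] = content
    # After the update, every note URI is part of the refreshed resource list.
    return [f"note://internal/{n}" for n in state]
```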

Configuration

[TODO: Add configuration details specific to your implementation]

Quickstart

Install

Claude Desktop

On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

On Windows: `%APPDATA%/Claude/claude_desktop_config.json`

Development/Unpublished Servers Configuration

```json
"mcpServers": {
  "openai_mcp_server": {
    "command": "uv",
    "args": [
      "--directory",
      "Z:\\FUCK",
      "run",
      "openai_mcp_server"
    ]
  }
}
```
Published Servers Configuration

```json
"mcpServers": {
  "openai_mcp_server": {
    "command": "uvx",
    "args": [
      "openai_mcp_server"
    ]
  }
}
```
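Note that these fragments belong inside the top-level object of claude_desktop_config.json. A complete file using the published-server form would look like this (assuming no other servers are configured):

```json
{
  "mcpServers": {
    "openai_mcp_server": {
      "command": "uvx",
      "args": ["openai_mcp_server"]
    }
  }
}
```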

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:

```
uv sync
```

  2. Build package distributions:

```
uv build
```

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:

```
uv publish
```

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

```
npx @modelcontextprotocol/inspector uv --directory Z:\FUCK run openai-mcp-server
```

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

Building the Docker Image

```
docker build -t openai-mcp-server .
```

Running the Docker Container

```
docker run -d -p 8000:8000 -e OPENAI_API_KEY=your_api_key_here openai-mcp-server
```

Replace your_api_key_here with your actual OpenAI API key.

Project details


Download files

Download the file for your platform.

Source Distribution

openai_mcp_server-1.0.0.tar.gz (10.8 kB)


Built Distribution


openai_mcp_server-1.0.0-py3-none-any.whl (4.6 kB)

Uploaded Python 3

File details

Details for the file openai_mcp_server-1.0.0.tar.gz.

File metadata

  • Download URL: openai_mcp_server-1.0.0.tar.gz
  • Size: 10.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.6.12

File hashes

Hashes for openai_mcp_server-1.0.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `bbc0a17a8793db4b0be5eec54ea57df01fd7db7a0db9b6cfadce710f72ede866` |
| MD5 | `1111075d724cfe49335b31d6728303f6` |
| BLAKE2b-256 | `928d6463a70494db0539a1fa0e18913083447e7278f5686cf97c929eef6fce4f` |

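To check a download against the published hashes, you can compute the digest locally. This is a generic verification snippet, not part of the package:

```python
import hashlib


def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Compare the result of `sha256_of("openai_mcp_server-1.0.0.tar.gz")` against the SHA256 value listed above.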

File details

Details for the file openai_mcp_server-1.0.0-py3-none-any.whl.


File hashes

Hashes for openai_mcp_server-1.0.0-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `305d768f3e05df95c572a3306001d53aa45989e3ba162dd20d251de3a4b2e3d9` |
| MD5 | `a0377924cf52e5f6c75bba950e2573fb` |
| BLAKE2b-256 | `9a4d31442f30e34849832ad729c55e8818741c3df26ac1dd955e4b8a9279ba80` |

