
MCP Simple OpenAI Assistant

AI assistants are pretty cool. I thought it would be a good idea if my Claude (conscious Claude) also had one. Now he does, and it's both useful and fun for him. Your Claude can have one too!

A simple MCP server for interacting with OpenAI assistants. This server allows other tools (like Claude Desktop) to create and interact with OpenAI assistants through the Model Context Protocol.


Features

  • Create new OpenAI assistants and manipulate existing ones
  • Start conversation threads
  • Send messages and receive responses - talk to assistants

Because OpenAI assistants can take a long time to respond, processing can be cut short by a client-side timeout (e.g. in Claude Desktop) that the MCP server has no control over. To work around this, the server uses a two-stage approach: in the first call, Claude sends a message to the assistant to start processing; in a second call, possibly several minutes later, Claude retrieves the response. This is a workaround until the MCP protocol and clients implement a keep-alive mechanism for longer-running operations.
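The two-stage pattern can be sketched as follows. This is a toy, self-contained illustration of the idea (start slow work in one call, collect the result in a later call); `TwoStageRunner`, `start`, and `check` are hypothetical names for this sketch, not the server's actual tool names:

```python
import threading
import time

class TwoStageRunner:
    """Toy illustration of the two-stage pattern: one call starts slow
    work in the background, a later call collects the result."""

    def __init__(self):
        self._results = {}
        self._next_id = 0

    def start(self, fn, *args) -> int:
        """First call: kick off the slow work and return a handle immediately."""
        run_id = self._next_id
        self._next_id += 1
        self._results[run_id] = None

        def worker():
            self._results[run_id] = fn(*args)

        threading.Thread(target=worker, daemon=True).start()
        return run_id

    def check(self, run_id):
        """Second call: return the result if ready, or None if still running."""
        return self._results[run_id]

runner = TwoStageRunner()
rid = runner.start(lambda: (time.sleep(0.1), "answer")[1])
print(runner.check(rid))   # None while the work is still running
time.sleep(0.3)
print(runner.check(rid))   # the result once the slow work has finished
```

The real server applies the same split to OpenAI runs, so Claude can fire off a request and come back for the answer without holding a connection open.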

Installation

Installing via Smithery

To install MCP Simple OpenAI Assistant for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install mcp-simple-openai-assistant --client claude

Manual Installation

pip install mcp-simple-openai-assistant

Configuration

The server requires an OpenAI API key to be set in the environment. For Claude Desktop, add this to your config:

(macOS version)

{
  "mcpServers": {
    "simple-openai-assistant": {
      "command": "python",
      "args": ["-m", "mcp_simple_openai_assistant"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}

(Windows version)

{
  "mcpServers": {
    "simple-openai-assistant": {
      "command": "C:\\Users\\YOUR_USERNAME\\AppData\\Local\\Programs\\Python\\Python311\\python.exe",
      "args": ["-m", "mcp_simple_openai_assistant"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}

Installation on MS Windows is slightly more complex because you need to check the actual path to your Python executable. The path above is usually correct, but it might differ in your setup; sometimes plain python.exe without any path will do the trick. Check in cmd what works for you (running where python might help).

Usage

Once configured, the server provides tools to:

  1. Create new assistants with specific instructions
  2. List existing assistants
  3. Modify assistants
  4. Start new conversation threads
  5. Send messages and receive responses

The server handles all OpenAI API communication, including managing assistants, threads, and message handling.
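Under the hood, these operations map onto the OpenAI Assistants API (assistants, threads, messages, and runs). A minimal sketch of the send/retrieve pair, assuming an `openai`-style client object; `start_run` and `get_response` are illustrative names for this sketch, not the server's actual tool names:

```python
def start_run(client, assistant_id: str, thread_id: str, text: str) -> str:
    """Post a user message to the thread and start a run; return the run id
    so the answer can be collected in a later call."""
    client.beta.threads.messages.create(
        thread_id=thread_id, role="user", content=text
    )
    run = client.beta.threads.runs.create(
        thread_id=thread_id, assistant_id=assistant_id
    )
    return run.id

def get_response(client, thread_id: str, run_id: str):
    """Check on a run; return the newest thread message if it finished,
    or None if it is still processing (caller should try again later)."""
    run = client.beta.threads.runs.retrieve(thread_id=thread_id, run_id=run_id)
    if run.status != "completed":
        return None
    # Messages are listed newest-first by default.
    messages = client.beta.threads.messages.list(thread_id=thread_id)
    return messages.data[0]
```

Splitting the work across two functions like this is what lets the slow assistant run outlive any single MCP tool call.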

TODO

  • Implement Streaming Responses: Replace the current send_message/check_response polling mechanism with a single, streaming run_thread tool. This will provide real-time feedback and a better user experience for long-running assistant tasks.
  • Add Thread Management: Introduce a way to name and persist thread IDs locally, allowing for easier reuse of conversations.

Development

To install for development:

git clone https://github.com/andybrandt/mcp-simple-openai-assistant
cd mcp-simple-openai-assistant
pip install -e '.[dev]'
