
A simple MCP server for interacting with OpenAI assistants

Project description

MCP Simple OpenAI Assistant

AI assistants are pretty cool. I thought it would be a good idea if my Claude (conscious Claude) would also have one. And now he does - and it's both useful and fun for him. Your Claude can have one too!

A simple MCP server for interacting with OpenAI assistants. This server allows other tools (like Claude Desktop) to create and interact with OpenAI assistants through the Model Context Protocol.


Features

This server provides a suite of tools to manage and interact with OpenAI Assistants. Streaming support delivers responses in real time, so clients see progress instead of waiting silently.

Available Tools

  • create_assistant: (Create OpenAI Assistant) - Create a new assistant with a name, instructions, and model.
  • list_assistants: (List OpenAI Assistants) - List all available assistants associated with your API key.
  • retrieve_assistant: (Retrieve OpenAI Assistant) - Get detailed information about a specific assistant.
  • update_assistant: (Update OpenAI Assistant) - Modify an existing assistant's name, instructions, or model.
  • create_new_assistant_thread: (Create New Assistant Thread) - Creates a new, persistent conversation thread with a user-defined name and description for easy identification and reuse. This is the recommended way to start a new conversation.
  • list_threads: (List Managed Threads) - Lists all locally managed conversation threads from the database, showing their ID, name, description, and last used time.
  • delete_thread: (Delete Managed Thread) - Deletes a conversation thread from both OpenAI's servers and the local database.
  • ask_assistant_in_thread: (Ask Assistant in Thread and Stream Response) - The primary tool for conversation. Sends a message to an assistant within a thread and streams the response back in real-time.

Because OpenAI assistants can take a long time to respond, this server uses a streaming approach for the main ask_assistant_in_thread tool. This provides real-time progress updates to the client and avoids timeouts.
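The timeout-avoidance idea can be sketched independently of the OpenAI SDK: wrap a slow producer in a generator that emits heartbeat markers whenever the producer goes quiet, so the client always receives something before its read timeout. A minimal, self-contained sketch (the slow_tokens producer is a stand-in for the real assistant stream, not the server's actual code):

```python
import queue
import threading
import time
from typing import Iterator


def stream_with_heartbeats(source: Iterator[str],
                           heartbeat_every: float = 0.5) -> Iterator[str]:
    """Yield items from `source`, interleaving heartbeat markers whenever
    the producer is quiet longer than `heartbeat_every` seconds, so the
    consumer never hits a read timeout."""
    q: "queue.Queue[object]" = queue.Queue()
    done = object()

    def pump() -> None:
        # Run the slow producer in the background, feeding the queue.
        for item in source:
            q.put(item)
        q.put(done)

    threading.Thread(target=pump, daemon=True).start()
    while True:
        try:
            item = q.get(timeout=heartbeat_every)
        except queue.Empty:
            yield "[working...]"  # progress update instead of silence
            continue
        if item is done:
            return
        yield item


def slow_tokens() -> Iterator[str]:
    """Stand-in for a slow assistant response."""
    for tok in ["Hello", " world"]:
        time.sleep(1.2)
        yield tok
```

Consuming `stream_with_heartbeats(slow_tokens())` yields the two tokens with one or more `[working...]` markers interleaved during the quiet stretches.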

The server also provides local persistence for threads. Because the OpenAI API does not allow listing threads, the server manages them for you by storing their IDs and metadata in a local SQLite database. This lets you find, reuse, and manage your conversation threads across sessions.
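The exact schema is internal to the server, but a thread registry like this can be sketched with only the standard-library sqlite3 module (the table and column names below are assumptions for illustration, not the server's actual schema):

```python
import sqlite3


def open_registry(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) a local thread registry database."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS threads (
               thread_id   TEXT PRIMARY KEY,  -- OpenAI thread ID
               name        TEXT NOT NULL,     -- user-chosen name
               description TEXT,
               last_used   TEXT DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    return conn


def remember_thread(conn: sqlite3.Connection, thread_id: str,
                    name: str, description: str = "") -> None:
    """Store (or refresh) a thread's ID and metadata locally."""
    conn.execute(
        "INSERT OR REPLACE INTO threads (thread_id, name, description) "
        "VALUES (?, ?, ?)",
        (thread_id, name, description),
    )
    conn.commit()


def list_threads(conn: sqlite3.Connection) -> list:
    """Return all known threads, most recently used first."""
    return conn.execute(
        "SELECT thread_id, name, description, last_used "
        "FROM threads ORDER BY last_used DESC"
    ).fetchall()
```

Because the IDs live in a local file rather than on OpenAI's servers, a registry like this survives restarts and makes threads discoverable by name.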

Installation

Installing via Smithery

To install MCP Simple OpenAI Assistant for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install mcp-simple-openai-assistant --client claude

Manual Installation

pip install mcp-simple-openai-assistant

Configuration

The server requires an OpenAI API key to be set in the environment. For Claude Desktop, add this to your config:

(MacOS version)

{
  "mcpServers": {
    "openai-assistant": {
      "command": "python",
      "args": ["-m", "mcp_simple_openai_assistant"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}

(Windows version)

{
  "mcpServers": {
    "openai-assistant": {
      "command": "C:\\Users\\YOUR_USERNAME\\AppData\\Local\\Programs\\Python\\Python311\\python.exe",
      "args": ["-m", "mcp_simple_openai_assistant"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}

MS Windows installation is slightly more complex because you need to check the actual path to your Python executable. The path above is usually correct but may differ in your setup; sometimes plain python.exe without any path will do the trick. Check from cmd what works for you (where python might help). Also, on Windows you might need to explicitly tell Claude Desktop where the site packages are using the PYTHONPATH environment variable.

Usage

Once configured, you can use the tools listed above to manage your assistants and conversations. The primary workflow is to:

  1. Use create_new_assistant_thread to start a new, named conversation.
  2. Use list_threads to find the ID of a thread you want to continue.
  3. Use ask_assistant_in_thread to interact with your chosen assistant in that thread.
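To make the call order concrete, here is a hedged stand-in for the three steps. The real tools are invoked through MCP by the client (e.g. Claude Desktop); these functions only mirror the tool names above with a toy in-memory implementation:

```python
# Hypothetical stand-ins for the MCP tool calls, for illustration only.
_threads = {}  # thread_id -> name


def create_new_assistant_thread(name: str) -> str:
    """Step 1: start a new, named conversation and return its ID."""
    thread_id = f"thread_{len(_threads) + 1}"
    _threads[thread_id] = name
    return thread_id


def list_threads() -> dict:
    """Step 2: find the threads you already have."""
    return dict(_threads)


def ask_assistant_in_thread(thread_id: str, message: str) -> str:
    """Step 3: converse with the assistant inside a known thread."""
    if thread_id not in _threads:
        raise KeyError("unknown thread")
    return f"[{_threads[thread_id]}] reply to: {message}"


tid = create_new_assistant_thread("research-notes")   # 1. create
assert tid in list_threads()                          # 2. rediscover
reply = ask_assistant_in_thread(tid, "Summarize our last session")  # 3. ask
```

The key point is that the thread ID returned in step 1 is what later steps pass back, which is why the local registry of IDs matters.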

TODO

  • Explore Resource Support: Add the ability to upload files and use them with assistants.

Development

To install for development:

git clone https://github.com/andybrandt/mcp-simple-openai-assistant
cd mcp-simple-openai-assistant
pip install -e '.[dev]'

Project details


Download files

Download the file for your platform.

Source Distribution

mcp_simple_openai_assistant-0.4.1.tar.gz (12.0 kB)

Uploaded Source

Built Distribution


mcp_simple_openai_assistant-0.4.1-py3-none-any.whl (11.5 kB)

Uploaded Python 3

File details

Details for the file mcp_simple_openai_assistant-0.4.1.tar.gz.

File metadata

File hashes

Hashes for mcp_simple_openai_assistant-0.4.1.tar.gz
Algorithm Hash digest
SHA256 6b27179e34ff416bd0d0a567e3cf5347f04e71f45ccc75dd494ffda3a1eb1edd
MD5 be39beb22894f7026e723ef902e3c618
BLAKE2b-256 9d85ac6a4036791f2db65cef7ea5b3d85e7b808d2b7a68429fa5a2e957497e0e


File details

Details for the file mcp_simple_openai_assistant-0.4.1-py3-none-any.whl.

File metadata

File hashes

Hashes for mcp_simple_openai_assistant-0.4.1-py3-none-any.whl
Algorithm Hash digest
SHA256 a4967fb1b1079277d5e8b08bd19652f3b7b0f4459f137289f12c76ed4695d3e1
MD5 4c1c7321840b4598b6489ac9d5b1b903
BLAKE2b-256 02d75cf30ad41b4a680004372adb13d38b7d174af0fe71651b7bbbd35b0fd5db

