
Read and write to Pinecone from Claude Desktop with Model Context Protocol.

Project description

Pinecone Model Context Protocol Server for Claude Desktop.

Read and write to a Pinecone index.

Components

flowchart TB
    subgraph Client["MCP Client (e.g., Claude Desktop)"]
        UI[User Interface]
    end

    subgraph MCPServer["MCP Server (pinecone-mcp)"]
        Server[Server Class]
        
        subgraph Handlers["Request Handlers"]
            ListRes[list_resources]
            ReadRes[read_resource]
            ListTools[list_tools]
            CallTool[call_tool]
            GetPrompt[get_prompt]
            ListPrompts[list_prompts]
        end
        
        subgraph Tools["Implemented Tools"]
            SemSearch[semantic-search]
            ReadDoc[read-document]
            UpsertDoc[upsert-document]
        end
    end

    subgraph PineconeService["Pinecone Service"]
        PC[Pinecone Client]
        subgraph PineconeFunctions["Pinecone Operations"]
            Search[search_records]
            Upsert[upsert_records]
            Fetch[fetch_records]
            List[list_records]
            Embed[generate_embeddings]
        end
        Index[(Pinecone Index)]
    end

    %% Connections
    UI --> Server
    Server --> Handlers
    
    ListTools --> Tools
    CallTool --> Tools
    
    Tools --> PC
    PC --> PineconeFunctions
    PineconeFunctions --> Index
    
    %% Data flow for semantic search
    SemSearch --> Search
    Search --> Embed
    Embed --> Index
    
    %% Data flow for document operations
    UpsertDoc --> Upsert
    ReadDoc --> Fetch
    ListRes --> List

    classDef primary fill:#2563eb,stroke:#1d4ed8,color:white
    classDef secondary fill:#4b5563,stroke:#374151,color:white
    classDef storage fill:#059669,stroke:#047857,color:white
    
    class Server,PC primary
    class Tools,Handlers secondary
    class Index storage

Resources

The server exposes the ability to read from and write to a configured Pinecone index.

Tools

  • semantic-search: Search for records in the Pinecone index.
  • read-document: Read a document from the Pinecone index.
  • upsert-document: Upsert a document into the Pinecone index.
  • process-document: Process a document end to end: chunk it, embed the chunks, and upsert them into the Pinecone index.
  • chunk-document: Split a document into chunks.
  • embed-document: Generate embeddings for a document using Pinecone's inference API.

Note: embeddings are generated via Pinecone's inference API, and chunking uses a rudimentary markdown splitter (via langchain). A rough sketch of this pipeline is shown below.
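
For illustration only, here is a minimal sketch of that pipeline written directly against the Pinecone Python SDK and langchain's markdown splitter. It is not the server's actual code; the index name, API key, embedding model, and chunk sizes are placeholder assumptions.

# Illustrative sketch, not the server's implementation.
# Placeholders: index name, API key, embedding model, chunk sizes.
from pinecone import Pinecone
from langchain_text_splitters import MarkdownTextSplitter

pc = Pinecone(api_key="{your-secret-api-key}")
index = pc.Index("{your-index-name}")

# chunk-document: split markdown text into chunks
splitter = MarkdownTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_text(open("notes.md").read())

# embed-document: generate embeddings with Pinecone's inference API
embeddings = pc.inference.embed(
    model="multilingual-e5-large",  # placeholder model choice
    inputs=chunks,
    parameters={"input_type": "passage"},
)

# upsert-document: write chunk vectors (keeping the text as metadata) to the index
index.upsert(vectors=[
    {"id": f"notes-{i}", "values": e["values"], "metadata": {"text": chunk}}
    for i, (chunk, e) in enumerate(zip(chunks, embeddings))
])

# semantic-search: embed a query and search the index
query = pc.inference.embed(
    model="multilingual-e5-large",
    inputs=["What do the notes say about chunking?"],
    parameters={"input_type": "query"},
)
results = index.query(vector=query[0]["values"], top_k=5, include_metadata=True)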

Quickstart

Install the server

We recommend using uv to install the server locally for Claude.

uv tool install mcp-pinecone

OR

uv pip install mcp-pinecone

Add your config as described below.

Claude Desktop

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Note: You might need to use the direct path to uv. Use which uv to find the path.

Development/Unpublished Servers Configuration

"mcpServers": {
  "mcp-pinecone": {
    "command": "uv",
    "args": [
      "--directory",
      "{project_dir}",
      "run",
      "mcp-pinecone"
    ]
  }
}

Published Servers Configuration

"mcpServers": {
  "mcp-pinecone": {
    "command": "uvx",
    "args": [
      "--index-name",
      "{your-index-name}",
      "--api-key",
      "{your-secret-api-key}",
      "mcp-pinecone"
    ]
  }
}
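
Either fragment belongs under the top-level mcpServers key of claude_desktop_config.json. For reference, a complete file for the published server might look like this (the index name and API key are placeholders):

{
  "mcpServers": {
    "mcp-pinecone": {
      "command": "uvx",
      "args": [
        "mcp-pinecone",
        "--index-name",
        "{your-index-name}",
        "--api-key",
        "{your-secret-api-key}"
      ]
    }
  }
}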

Sign up to Pinecone

You can sign up for a Pinecone account here.

Get an API key

Create a new index in Pinecone and get an API key from the Pinecone dashboard, then use them to replace {your-index-name} and {your-secret-api-key} in the config.
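
If you prefer to create the index from code rather than the dashboard, a minimal sketch with the Pinecone Python SDK looks roughly like this; the dimension, metric, cloud, and region are placeholder assumptions and must match the embedding model and setup you actually use.

from pinecone import Pinecone, ServerlessSpec

pc = Pinecone(api_key="{your-secret-api-key}")

# Placeholder values: choose a dimension/metric that match your embedding model.
pc.create_index(
    name="{your-index-name}",
    dimension=1024,
    metric="cosine",
    spec=ServerlessSpec(cloud="aws", region="us-east-1"),
)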

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD
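
For example, with a PyPI API token (placeholder value shown):

UV_PUBLISH_TOKEN="pypi-your-token-here" uv publish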

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory {project_dir} run mcp-pinecone

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Source Code

The source code is available on GitHub.

Contributing

Send your ideas and feedback to me on Bluesky or by opening an issue.

Download files

Download the file for your platform.

Source Distribution

mcp_pinecone-0.1.5.tar.gz (59.4 kB)


Built Distribution

mcp_pinecone-0.1.5-py3-none-any.whl (13.8 kB)


File details

Details for the file mcp_pinecone-0.1.5.tar.gz.

File metadata

  • Download URL: mcp_pinecone-0.1.5.tar.gz
  • Upload date:
  • Size: 59.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.5.7

File hashes

Hashes for mcp_pinecone-0.1.5.tar.gz

  • SHA256: 3f660cf90e34b8b2ef2221001878eecca5bc657392a1035d8cda5dd5788286e0
  • MD5: f2b952bb9ca83d7dff96d9f6c853af28
  • BLAKE2b-256: f3e82c92add09c0d6985257635f766fe5cf50837f56774a209f8576b5c987f78


File details

Details for the file mcp_pinecone-0.1.5-py3-none-any.whl.

File hashes

Hashes for mcp_pinecone-0.1.5-py3-none-any.whl

  • SHA256: 49ba99253af4c706b8be85c2d711e0b7cf213e609558c2400b461e065269eac0
  • MD5: 7cb57385a2476ce9f12f553e310d1bac
  • BLAKE2b-256: 2d7dcb6c063d5182e2e98c9967f4d17ddaa5e711cc5b5ae908d5a606218371ee

