
Collection of MCP tools and Agents to work with the deepset AI platform. Create, debug, or learn about pipelines on the platform. Usable from the CLI, Cursor, Claude Code, or other MCP clients.

Project description

MCP Server for the deepset AI platform

The deepset MCP server exposes tools that MCP clients like Claude or Cursor can use to interact with the deepset AI platform.

Agents can use these tools to:

  • develop and iterate on Pipelines or Indexes
  • debug Pipelines and Indexes
  • search the deepset AI platform documentation

GIF showing CLI interaction with the MCP server

Installation

Claude Desktop App

Prerequisites:

  • Claude Desktop App needs to be installed
  • You need to be on the Claude Pro, Team, Max, or Enterprise plan
  • You need an installation of Docker (see "Using uv instead of Docker" below if you prefer uv)
  • You need an API key for the deepset platform

Steps:

  1. Go to: /Users/your_user/Library/Application Support/Claude (Mac)
  2. Either open or create claude_desktop_config.json
  3. Add the following JSON as your config (or update your existing config if you are already using other MCP servers):
{
  "mcpServers": {
    "deepset": {
      "command": "/usr/local/bin/docker",
      "args": [
        "run",
        "-i",
        "-e",
        "DEEPSET_WORKSPACE",
        "-e",
        "DEEPSET_API_KEY",
        "deepset/deepset-mcp-server:main"
      ],
      "env": {
        "DEEPSET_WORKSPACE": "<WORKSPACE>",
        "DEEPSET_API_KEY": "<DEEPSET_API_KEY>"
      }
    }
  }
}
  4. Quit and restart the Claude Desktop App
  5. The deepset server should appear in the "Search and Tools" menu (this takes a few seconds as the Docker image needs to be downloaded and started)
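If the server does not show up, a common cause is a malformed config file. As a sanity check, the sketch below (plain Python, standard library only; the path and the specific checks are assumptions based on the steps above, not part of this package) parses claude_desktop_config.json and flags placeholder values that were never filled in:

```python
import json
from pathlib import Path

# Default location on macOS; adjust for your OS and user (assumption).
config_path = Path.home() / "Library" / "Application Support" / "Claude" / "claude_desktop_config.json"


def check_config(raw: str) -> list[str]:
    """Return a list of problems found in a claude_desktop_config.json string."""
    problems: list[str] = []
    try:
        config = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    server = config.get("mcpServers", {}).get("deepset")
    if server is None:
        return ["no 'deepset' entry under 'mcpServers'"]
    if "command" not in server:
        problems.append("missing 'command'")
    env = server.get("env", {})
    for key in ("DEEPSET_WORKSPACE", "DEEPSET_API_KEY"):
        # An unset value or a leftover "<PLACEHOLDER>" both mean the key was not filled in.
        if not env.get(key) or env[key].startswith("<"):
            problems.append(f"env var {key} not filled in")
    return problems


if config_path.exists():
    print(check_config(config_path.read_text()) or "config looks OK")
```

This only validates structure; it cannot tell you whether the API key itself is valid.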

Screenshot of the Search and Tools menu in the Claude Desktop App with deepset server running.

Using uv instead of Docker

Running the server with uv gives you a faster startup time and uses slightly fewer system resources.

  1. Install uv if you don't have it yet
  2. Put the following into your claude_desktop_config.json
{
  "mcpServers": {
    "deepset": {
      "command": "uvx",
      "args": [
        "deepset-mcp"
      ],
      "env": {
        "DEEPSET_WORKSPACE": "<WORKSPACE>",
        "DEEPSET_API_KEY": "<DEEPSET_API_KEY>"
      }
    }
  }
}

This will load the deepset-mcp package from PyPI and install it into a temporary virtual environment.

  3. Quit and restart the Claude Desktop App

Other MCP Clients

deepset-mcp can be used with other MCP clients.

Depending on your installation, you need to configure your MCP client with one of the following commands:

uvx deepset-mcp --workspace your_workspace --api-key your_api_key

If you installed the deepset-mcp package globally and added it to your PATH, you can just run:

deepset-mcp --workspace your_workspace --api-key your_api_key

The server runs locally using stdio to communicate with the client.
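Under the hood, MCP's stdio transport exchanges newline-delimited JSON-RPC 2.0 messages between client and server. As a rough illustration (the field values below follow the general MCP specification and are assumptions, not something specific to this package), the first message a client writes to the server's stdin looks roughly like:

```python
import json

# A minimal sketch of the first message an MCP client sends over stdio.
# protocolVersion and clientInfo values are illustrative assumptions.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# stdio transport: one JSON-RPC message per line on the server's stdin.
wire_line = json.dumps(initialize_request) + "\n"
print(wire_line)
```

MCP clients like Claude Desktop or Cursor handle this handshake for you; you never write these messages by hand.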

Advanced Configuration

Tool Selection

You can customize which tools the MCP server exposes. Use the --tools option in your config to explicitly specify which tools should be exposed.

You can list available tools with: deepset-mcp --list-tools.

To only expose the list_pipelines and get_pipeline tools you would use the following command:

deepset-mcp --tools list_pipelines get_pipeline

For smooth operations, you should always expose the get_from_object_store and get_slice_from_object_store tools.
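If you manage the server through a client config rather than the command line, the same selection can go into the args array. A sketch based on the uv config above (the tool names are taken from this section, with the object-store tools included per the recommendation; passing tool names as positional arguments after --tools is an assumption based on the CLI example):

```json
{
  "mcpServers": {
    "deepset": {
      "command": "uvx",
      "args": [
        "deepset-mcp",
        "--tools",
        "list_pipelines",
        "get_pipeline",
        "get_from_object_store",
        "get_slice_from_object_store"
      ],
      "env": {
        "DEEPSET_WORKSPACE": "<WORKSPACE>",
        "DEEPSET_API_KEY": "<DEEPSET_API_KEY>"
      }
    }
  }
}
```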

Allowing access to multiple workspaces

The basic configuration uses a hardcoded workspace which you pass in via the DEEPSET_WORKSPACE environment variable. If you want to allow an agent to access resources from multiple workspaces, you can use --workspace-mode explicit in your config.

For example:

{
  "mcpServers": {
    "deepset": {
      "command": "uvx",
      "args": [
        "deepset-mcp",
        "--workspace-mode",
        "explicit"
      ],
      "env": {
        "DEEPSET_API_KEY": "<DEEPSET_API_KEY>"
      }
    }
  }
}

An agent using the MCP server now has access to all workspaces that the API key has access to. When interacting with most resources, you need to tell the agent which workspace to use for an action. Instead of prompting it with "list my pipelines", you would now prompt it with "list my pipelines in the staging workspace".

Prompts

All tools exposed through the MCP server have minimal prompts. Any Agent interacting with these tools benefits from an additional system prompt.

View the recommended prompt here.

This prompt is also exposed as the deepset_recommended_prompt on the MCP server. In Claude Desktop, click "add from deepset" to add the prompt to your context. A better way to add system prompts in Claude Desktop is through "Projects".

You can customize the system prompt to your specific needs.

Use Cases

The primary way to use the deepset MCP server is through an LLM that interacts with the deepset MCP tools in an agentic way.

Creating Pipelines

Tell the LLM about the type of pipeline you want to build. Creating new pipelines will work best if you use terminology that is similar to what is used on the deepset AI platform or in Haystack.

Your prompts should be precise and specific.

Examples:

  • "Build a RAG pipeline with hybrid retrieval that uses claude-sonnet-4 from Anthropic as the LLM."
  • "Build an Agent that can iteratively search the web (deep research). Use SerperDev for web search and GPT-4o as the LLM."

You can also instruct the LLM to deploy pipelines, and it can issue search requests against pipelines to test them.

Best Practices

  • Be specific in your requests.
  • Point the LLM to examples: if there is already a similar pipeline in your workspace, ask it to look at that first; if you have a template in mind, ask it to look at the template.
  • Instruct the LLM to iterate with you locally before creating the pipeline: have it validate the drafts, and only let it create the pipeline once it is up to your standards.

Debugging Pipelines

The deepset-mcp tools allow LLMs to debug pipelines on the deepset AI platform. Primary tools used for debugging are:

  • get_logs
  • validate_pipeline
  • search_pipeline
  • search_pipeline_templates
  • search_component_definition

You can ask the LLM to check the logs of a specific pipeline in case it is already deployed but has errors. The LLM will find errors in the logs and devise strategies to fix them. If your pipeline is not deployed yet, the LLM can autonomously validate it and fix validation errors.
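Combining the --tools option from Advanced Configuration with the tools above (plus the object-store tools recommended there), a debugging-focused server could hypothetically be started like this; the exact combination is an illustration, not a documented preset:

```shell
deepset-mcp \
  --tools get_logs validate_pipeline search_pipeline \
          search_pipeline_templates search_component_definition \
          get_from_object_store get_slice_from_object_store \
  --workspace your_workspace --api-key your_api_key
```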

CLI

You can use the MCP server as a Haystack Agent through a command-line interface.

Install with uv tool install "deepset-mcp[cli]".

Start the interactive CLI with:

deepset agent chat

You can set environment variables before starting the Agent via:

export DEEPSET_API_KEY=your_key
export DEEPSET_WORKSPACE=your_workspace

You can also provide an .env file using the --env-file option:

deepset agent chat --env-file your/env/.file

The agent will load environment variables from the file on startup.
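For reference, such a .env file simply lists the same variables shown above, one per line (the values are placeholders):

```
DEEPSET_API_KEY=your_key
DEEPSET_WORKSPACE=your_workspace
```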

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

deepset_mcp-0.0.3rc1.tar.gz (25.4 MB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

deepset_mcp-0.0.3rc1-py3-none-any.whl (157.8 kB)


File details

Details for the file deepset_mcp-0.0.3rc1.tar.gz.

File metadata

  • Download URL: deepset_mcp-0.0.3rc1.tar.gz
  • Upload date:
  • Size: 25.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for deepset_mcp-0.0.3rc1.tar.gz
  • SHA256: 88988d35cf822f894dcd6c88653837338f7d20841642bffd31670b3fe5399b79
  • MD5: dd9bc272efb1582e9eb3f8700e6337d2
  • BLAKE2b-256: 25709702fdcc76596d2e9dd91f26c89ecde2ca44e0f3719476a63f7feada3bc0

See more details on using hashes here.

Provenance

The following attestation bundles were made for deepset_mcp-0.0.3rc1.tar.gz:

Publisher: pypi_release.yml on deepset-ai/deepset-mcp-server

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file deepset_mcp-0.0.3rc1-py3-none-any.whl.

File metadata

  • Download URL: deepset_mcp-0.0.3rc1-py3-none-any.whl
  • Upload date:
  • Size: 157.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for deepset_mcp-0.0.3rc1-py3-none-any.whl
  • SHA256: 26500d9f8e0a3c3c6b9ac8861e50d2ef4dd3d81d833aeaebd3c77885020db8ed
  • MD5: 2139e0e6c7d706c04ef0588cdd237b25
  • BLAKE2b-256: bcbaa847720c89235c13cea9faec188d855c61cad968be5ed9371f2e8ddce46d

See more details on using hashes here.

Provenance

The following attestation bundles were made for deepset_mcp-0.0.3rc1-py3-none-any.whl:

Publisher: pypi_release.yml on deepset-ai/deepset-mcp-server

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
