A collection of MCP tools and agents for working with the deepset AI platform. Create, debug, or learn about pipelines on the platform. Usable from the CLI, Cursor, Claude Code, or other MCP clients.
Project description
MCP Server for the deepset AI platform
The deepset MCP server exposes tools that MCP clients like Claude or Cursor can use to interact with the deepset AI platform.
Agents can use these tools to:
- develop and iterate on Pipelines or Indexes
- debug Pipelines and Indexes
- search the deepset AI platform documentation
Contents
Installation
Claude Desktop App
Prerequisites:
- Claude Desktop App needs to be installed
- You need to be on the Claude Pro, Team, Max, or Enterprise plan
- You need Docker installed (see "Using uv instead of Docker" below if you prefer uv over Docker)
- You need an API key for the deepset platform
Steps:
- Go to /Users/your_user/Library/Application Support/Claude (Mac)
- Either open or create claude_desktop_config.json
- Add the following JSON as your config (or merge it into your existing config if you are already using other MCP servers)
{
"mcpServers": {
"deepset": {
"command": "/usr/local/bin/docker",
"args": [
"run",
"-i",
"-e",
"DEEPSET_WORKSPACE",
"-e",
"DEEPSET_API_KEY",
"deepset/deepset-mcp-server:main"
],
"env": {
"DEEPSET_WORKSPACE":"<WORKSPACE>",
"DEEPSET_API_KEY":"<DEEPSET_API_KEY>"
}
}
}
}
- Quit and start the Claude Desktop App
- The deepset server should appear in the "Search and Tools" menu (this takes a few seconds as the Docker image needs to be downloaded and started)
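If you already have other MCP servers configured, merging the "deepset" entry by hand is error-prone. As a minimal sketch (the helper name and file handling are my own, not part of deepset-mcp), this shows how the entry can be merged into an existing claude_desktop_config.json without clobbering other servers:

```python
import json
from pathlib import Path

# The server entry from the config example above.
DEEPSET_ENTRY = {
    "command": "/usr/local/bin/docker",
    "args": ["run", "-i", "-e", "DEEPSET_WORKSPACE", "-e", "DEEPSET_API_KEY",
             "deepset/deepset-mcp-server:main"],
    "env": {"DEEPSET_WORKSPACE": "<WORKSPACE>", "DEEPSET_API_KEY": "<DEEPSET_API_KEY>"},
}

def merge_deepset_config(path: Path) -> dict:
    """Load the config if it exists, add the deepset server entry, write it back."""
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["deepset"] = DEEPSET_ENTRY
    path.write_text(json.dumps(config, indent=2))
    return config
```

Because the helper uses `setdefault`, any other servers already listed under `mcpServers` are left untouched.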
Using uv instead of Docker
Running the server with uv gives you faster startup times and consumes slightly fewer resources on your system.
- Install uv if you don't have it yet
- Put the following into your claude_desktop_config.json:
{
"mcpServers": {
"deepset": {
"command": "uvx",
"args": [
"deepset-mcp"
],
"env": {
"DEEPSET_WORKSPACE":"<WORKSPACE>",
"DEEPSET_API_KEY":"<DEEPSET_API_KEY>"
}
}
}
}
This will load the deepset-mcp package from PyPI and install it into a temporary virtual environment.
- Quit and start the Claude Desktop App
Other MCP Clients
deepset-mcp can be used with other MCP clients; where the configuration lives depends on the client, so consult your client's documentation.
Generally speaking, depending on your installation, you need to configure the MCP client with one of the following commands:
uvx deepset-mcp --workspace your_workspace --api-key your_api_key
If you installed the deepset-mcp package globally and added it to your PATH, you can just run:
deepset-mcp --workspace your_workspace --api-key your_api_key
The server runs locally using stdio to communicate with the client.
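The stdio transport is plain JSON-RPC 2.0 exchanged over the server process's stdin and stdout. As a sketch of what that looks like on the wire (field names follow the MCP specification; the client name and protocol version string here are illustrative), this is the shape of the initialize request a client sends after spawning the server:

```python
import json

# The first message an MCP client writes to the server's stdin after
# spawning it. "protocolVersion" and "capabilities" follow the MCP spec;
# "clientInfo" is whatever the client uses to identify itself.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Serialized form, as it would be written to the server's stdin.
wire_message = json.dumps(initialize_request)
```

The server answers on stdout with a matching JSON-RPC response, after which the client can list and call the deepset tools.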
Advanced Configuration
Tool Selection
You can customize which tools the MCP server should expose.
Use the `--tools` option in your config to explicitly specify which tools should be exposed.
You can list the available tools with: deepset-mcp --list-tools.
To expose only the list_pipelines and get_pipeline tools, you would use the following command:
deepset-mcp --tools list_pipelines get_pipeline
For smooth operation, you should always expose the get_from_object_store and get_slice_from_object_store tools as well.
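Putting this together, here is a sketch of a Claude Desktop config that exposes only the two pipeline read tools plus the object-store tools recommended above (placeholders as in the earlier examples):

```json
{
  "mcpServers": {
    "deepset": {
      "command": "uvx",
      "args": [
        "deepset-mcp",
        "--tools",
        "list_pipelines",
        "get_pipeline",
        "get_from_object_store",
        "get_slice_from_object_store"
      ],
      "env": {
        "DEEPSET_WORKSPACE": "<WORKSPACE>",
        "DEEPSET_API_KEY": "<DEEPSET_API_KEY>"
      }
    }
  }
}
```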
Allowing access to multiple workspaces
The basic configuration uses a hardcoded workspace which you pass in via the DEEPSET_WORKSPACE environment variable.
If you want to allow an agent to access resources from multiple workspaces, you can use --workspace-mode explicit
in your config.
For example:
{
"mcpServers": {
"deepset": {
"command": "uvx",
"args": [
"deepset-mcp",
"--workspace-mode",
"explicit"
],
"env": {
"DEEPSET_API_KEY":"<DEEPSET_API_KEY>"
}
}
}
}
An agent using the MCP server now has access to all workspaces that the API key can access. For most resources, you will need to tell the agent which workspace to use when performing an action. Instead of prompting it with "list my pipelines", you would now prompt it with "list my pipelines in the staging workspace".
Prompts
All tools exposed through the MCP server have minimal prompts. Any Agent interacting with these tools benefits from an additional system prompt.
View the recommended prompt here.
This prompt is also exposed as the deepset_recommended_prompt on the MCP server.
In Claude Desktop, click "Add from deepset" to add the prompt to your context.
A better way to add system prompts in Claude Desktop is through "Projects".
You can customize the system prompt to your specific needs.
Use Cases
The primary way to use the deepset MCP server is through an LLM that interacts with the deepset MCP tools in an agentic way.
Creating Pipelines
Tell the LLM about the type of pipeline you want to build. Creating new pipelines will work best if you use terminology that is similar to what is used on the deepset AI platform or in Haystack.
Your prompts should be precise and specific.
Examples:
- "Build a RAG pipeline with hybrid retrieval that uses claude-sonnet-4 from Anthropic as the LLM."
- "Build an Agent that can iteratively search the web (deep research). Use SerperDev for web search and GPT-4o as the LLM."
You can also instruct the LLM to deploy pipelines, and it can issue search requests against pipelines to test them.
Best Practices
- Be specific in your requests.
- Point the LLM to examples: if a similar pipeline already exists in your workspace, ask it to look at that first; if you have a template in mind, ask it to look at the template.
- Instruct the LLM to iterate with you locally before creating the pipeline: have it validate the drafts, and only let it create the pipeline once it is up to your standards.
Debugging Pipelines
The deepset-mcp tools allow LLMs to debug pipelines on the deepset AI platform.
Primary tools used for debugging are:
- get_logs
- validate_pipeline
- search_pipeline
- search_pipeline_templates
- search_component_definition
You can ask the LLM to check the logs of a specific pipeline in case it is already deployed but has errors. The LLM will find errors in the logs and devise strategies to fix them. If your pipeline is not deployed yet, the LLM can autonomously validate it and fix validation errors.
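The validate-and-fix cycle described above can be sketched as a simple loop. Everything below is hypothetical glue: `validate_pipeline` is a stub standing in for the MCP tool of the same name, and `propose_fix` stands in for the LLM's edit step.

```python
# Hypothetical sketch of an agent's debug loop around the validate_pipeline
# tool. The two stubs simulate the tool call and the LLM fix.

def validate_pipeline(yaml_config: str) -> list[str]:
    """Stub for the validate_pipeline MCP tool: returns a list of errors."""
    return [] if "retriever" in yaml_config else ["missing component: retriever"]

def propose_fix(yaml_config: str, errors: list[str]) -> str:
    """Stub for the LLM edit step: rewrites the draft to address the errors."""
    return yaml_config + "\ncomponents:\n  retriever: {}"

def debug_until_valid(yaml_config: str, max_rounds: int = 5) -> tuple[str, list[str]]:
    """Validate a draft, let the LLM fix it, and repeat until it passes."""
    for _ in range(max_rounds):
        errors = validate_pipeline(yaml_config)
        if not errors:
            return yaml_config, []
        yaml_config = propose_fix(yaml_config, errors)
    return yaml_config, validate_pipeline(yaml_config)
```

In practice the agent drives this loop itself through tool calls; the sketch only illustrates the control flow, bounded by a maximum number of rounds so a stubborn error cannot loop forever.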
CLI
You can use the MCP server as a Haystack Agent through a command-line interface.
Install it with uv tool install "deepset-mcp[cli]".
Start the interactive CLI with:
deepset agent chat
You can set environment variables before starting the Agent via:
export DEEPSET_API_KEY=your_key
export DEEPSET_WORKSPACE=your_workspace
You can also provide a .env file using the --env-file option:
deepset agent chat --env-file your/env/.file
The agent will load environment variables from the file on startup.
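Loading an env file amounts to reading KEY=VALUE lines and exporting them before the agent starts. A minimal sketch of that behavior (the real implementation may differ, e.g. it may use python-dotenv):

```python
import os

def load_env_file(path: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, ignoring blanks and # comments,
    and export them into the process environment."""
    loaded = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip()
    os.environ.update(loaded)
    return loaded
```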
File details
Details for the file deepset_mcp-0.0.3rc1.tar.gz.
File metadata
- Download URL: deepset_mcp-0.0.3rc1.tar.gz
- Size: 25.4 MB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 88988d35cf822f894dcd6c88653837338f7d20841642bffd31670b3fe5399b79 |
| MD5 | dd9bc272efb1582e9eb3f8700e6337d2 |
| BLAKE2b-256 | 25709702fdcc76596d2e9dd91f26c89ecde2ca44e0f3719476a63f7feada3bc0 |
Provenance
The following attestation bundles were made for deepset_mcp-0.0.3rc1.tar.gz:
Publisher: pypi_release.yml on deepset-ai/deepset-mcp-server

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: deepset_mcp-0.0.3rc1.tar.gz
- Subject digest: 88988d35cf822f894dcd6c88653837338f7d20841642bffd31670b3fe5399b79
- Sigstore transparency entry: 267308935
- Permalink: deepset-ai/deepset-mcp-server@8f060d2e8f6493d47e82c2e45e3eef049b86375c
- Branch / Tag: refs/tags/v0.0.3-rc1
- Owner: https://github.com/deepset-ai
- Access: internal
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi_release.yml@8f060d2e8f6493d47e82c2e45e3eef049b86375c
- Trigger Event: push
File details
Details for the file deepset_mcp-0.0.3rc1-py3-none-any.whl.
File metadata
- Download URL: deepset_mcp-0.0.3rc1-py3-none-any.whl
- Size: 157.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 26500d9f8e0a3c3c6b9ac8861e50d2ef4dd3d81d833aeaebd3c77885020db8ed |
| MD5 | 2139e0e6c7d706c04ef0588cdd237b25 |
| BLAKE2b-256 | bcbaa847720c89235c13cea9faec188d855c61cad968be5ed9371f2e8ddce46d |
Provenance
The following attestation bundles were made for deepset_mcp-0.0.3rc1-py3-none-any.whl:
Publisher: pypi_release.yml on deepset-ai/deepset-mcp-server

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: deepset_mcp-0.0.3rc1-py3-none-any.whl
- Subject digest: 26500d9f8e0a3c3c6b9ac8861e50d2ef4dd3d81d833aeaebd3c77885020db8ed
- Sigstore transparency entry: 267308942
- Permalink: deepset-ai/deepset-mcp-server@8f060d2e8f6493d47e82c2e45e3eef049b86375c
- Branch / Tag: refs/tags/v0.0.3-rc1
- Owner: https://github.com/deepset-ai
- Access: internal
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi_release.yml@8f060d2e8f6493d47e82c2e45e3eef049b86375c
- Trigger Event: push