
Contextual MCP Server

A Model Context Protocol (MCP) server that provides RAG (Retrieval-Augmented Generation) capabilities using Contextual AI. This server integrates with a variety of MCP clients and gives you the flexibility to decide what functionality to offer. In this README, we show integration with both Cursor IDE and Claude Desktop.

Contextual AI now offers a hosted server inside the platform available at: https://mcp.app.contextual.ai/mcp/
After you connect to the server, you can use the tools, such as query, provided by the platform MCP server.
For a complete walkthrough, check out the MCP user guide.

Overview

An MCP server acts as a bridge between AI interfaces (Cursor IDE or Claude Desktop) and a specialized Contextual AI agent. It enables:

  1. Query Processing: Directs your domain-specific questions to a dedicated Contextual AI agent
  2. Intelligent Retrieval: Searches through comprehensive information in your knowledge base
  3. Context-Aware Responses: Generates answers that:
  • Are grounded in source documentation
  • Include citations and attributions
  • Maintain conversation context

Integration Flow

Cursor/Claude Desktop → MCP Server → Contextual AI RAG Agent
        ↑                  ↓             ↓                         
        └──────────────────┴─────────────┴─────────────── Response with citations

Prerequisites

  • Python 3.10 or higher
  • Cursor IDE and/or Claude Desktop
  • Contextual AI API key
  • MCP-compatible environment

Installation

  1. Clone the repository:
git clone https://github.com/ContextualAI/contextual-mcp-server.git
cd contextual-mcp-server
  2. Create and activate a virtual environment:
python -m venv .venv
source .venv/bin/activate  # On Windows, use `.venv\Scripts\activate`
  3. Install dependencies:
pip install -e .

Configuration

Configure MCP Server

The server requires some settings to be configured before use. For example, the single_agent server should be customized with an appropriate docstring for your RAG agent.

The docstring for your query tool is critical as it helps the MCP client understand when to route questions to your RAG agent. Make it specific to your knowledge domain. Here is an example:

A research tool focused on financial data on the largest US firms

or

A research tool focused on technical documents for Omaha semiconductors
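To make this concrete, here is a minimal sketch of how such a domain-specific docstring sits on the query tool. The @mcp.tool() decorator is omitted so the sketch runs standalone, and the function body is a placeholder, not the real implementation:

```python
# Sketch: the docstring on your query tool is what the MCP client reads
# when deciding whether to route a question to your RAG agent. In a real
# server this function would be decorated with @mcp.tool().
def query(prompt: str) -> str:
    """A research tool focused on financial data on the largest US firms.

    Routes domain-specific questions to a dedicated Contextual AI RAG agent
    and returns a grounded answer with citations.
    """
    # Placeholder implementation for illustration only.
    return f"Routed to RAG agent: {prompt}"

# The first docstring line is the most important routing signal:
print(query.__doc__.splitlines()[0])
```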

The server also requires the following settings from your RAG Agent:

  • API_KEY: Your Contextual AI API key
  • AGENT_ID: Your Contextual AI agent ID

If you'd like to store these values in a .env file, you can specify them like so:

cat > .env << EOF
API_KEY=key...
AGENT_ID=...
EOF
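As a sketch of how the server might read these settings at startup (reading from os.environ with placeholder fallbacks is an assumption for illustration; the actual server may load the .env file via a library such as python-dotenv):

```python
import os

# Read the two required settings from the environment. Placeholder
# fallbacks are used here so the sketch runs standalone; a real server
# should fail fast if either value is missing.
API_KEY = os.environ.get("API_KEY", "key-placeholder")
AGENT_ID = os.environ.get("AGENT_ID", "agent-placeholder")

if not API_KEY or not AGENT_ID:
    raise RuntimeError("API_KEY and AGENT_ID must be set for the MCP server")

print(f"Configured for agent {AGENT_ID}")
```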

The repo also contains more advanced MCP servers for multi-agent systems and a document agent.

AI Interface Integration

This MCP server can be integrated with a variety of clients. To use with either Cursor IDE or Claude Desktop create or modify the MCP configuration file in the appropriate location:

  1. First, find the path to your uv installation:
UV_PATH=$(which uv)
echo $UV_PATH
# Example output: /Users/username/miniconda3/bin/uv
  2. Create the configuration file using the full path from step 1. Note that JSON does not allow comments, so the file must contain none; ${workspaceFolder} will be replaced with your project path:
cat > mcp.json << EOF
{
 "mcpServers": {
   "ContextualAI-TechDocs": {
     "command": "$UV_PATH",
     "args": [
       "--directory",
       "\${workspaceFolder}",
       "run",
       "multi-agent/server.py"
     ]
   }
 }
}
EOF
  3. Move the file to the correct location (see below for options):
mkdir -p .cursor/
mv mcp.json .cursor/
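As an alternative to the heredoc, the configuration can be generated with Python's json module, which guarantees valid JSON (JSON permits no comments). The fallback path below is a placeholder, not a real installation path:

```python
import json
import shutil

# Build mcp.json programmatically so the output is guaranteed to be
# valid JSON. shutil.which("uv") replicates step 1; the fallback path
# is a placeholder for illustration.
uv_path = shutil.which("uv") or "/usr/local/bin/uv"
config = {
    "mcpServers": {
        "ContextualAI-TechDocs": {
            "command": uv_path,
            "args": [
                "--directory",
                "${workspaceFolder}",  # replaced with your project path
                "run",
                "multi-agent/server.py",
            ],
        }
    }
}
print(json.dumps(config, indent=2))
```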

Configuration locations:

  • For Cursor:
    • Project-specific: .cursor/mcp.json in your project directory
    • Global: ~/.cursor/mcp.json for system-wide access
  • For Claude Desktop:
    • Use the same configuration file format in the appropriate Claude Desktop configuration directory

Environment Setup

This project uses uv for dependency management, which provides faster and more reliable Python package installation.

Usage

The server provides Contextual AI RAG capabilities using the Python SDK, which makes a variety of commands available to MCP clients such as Cursor IDE and Claude Desktop. The current server focuses on the query command from the Contextual AI Python SDK; however, you could extend it to support other features such as listing all the agents, updating retrieval settings, updating prompts, extracting retrievals, or downloading metrics.
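A hedged sketch of what the query tool might look like around the SDK. The client call shape (client.agents.query.create(...)) is an assumption based on the SDK's documented surface, and a stub client stands in so the sketch runs offline without an API key:

```python
# Offline sketch of wrapping the Contextual AI Python SDK's query call.
# _StubClient stands in for the real SDK client for illustration.

class _StubClient:
    """Stands in for the real Contextual AI client in this offline sketch."""
    class agents:
        class query:
            @staticmethod
            def create(agent_id, messages):
                # Mimics a response object with a message.content attribute.
                msg = type("Message", (), {
                    "content": f"[stubbed grounded answer from agent {agent_id}]"
                })()
                return type("Response", (), {"message": msg})()

def query_agent(client, agent_id: str, question: str) -> str:
    # In the real server, this call sends the question to your RAG agent
    # and returns a grounded answer with citations.
    response = client.agents.query.create(
        agent_id=agent_id,
        messages=[{"role": "user", "content": question}],
    )
    return response.message.content

answer = query_agent(_StubClient(), "agent-123", "What is the RF345 init code?")
print(answer)
```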

Example Usage

# In Cursor, you might ask:
"Show me the code for initiating the RF345 microchip?"

# The MCP client will:
1. Determine if this should be routed to the MCP Server

# Then the MCP server will:
1. Route the query to the Contextual AI agent
2. Retrieve relevant documentation
3. Generate a response with specific citations
4. Return the formatted answer to Cursor
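The routing decision in step 1 can be sketched as follows. The keyword check is purely illustrative: real MCP clients route based on each tool's docstring, not on hard-coded keywords.

```python
# Illustrative stand-in for the MCP client's routing decision.
DOMAIN_KEYWORDS = {"microchip", "rf345", "semiconductor"}

def should_route_to_rag(question: str) -> bool:
    """Return True if the question looks like it belongs to the RAG domain."""
    words = question.lower().split()
    return any(keyword in word for word in words for keyword in DOMAIN_KEYWORDS)

print(should_route_to_rag("Show me the code for initiating the RF345 microchip?"))
```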

Key Benefits

  1. Accurate Responses: All answers are grounded in your documentation
  2. Source Attribution: Every response includes references to source documents
  3. Context Awareness: The system maintains conversation context for follow-up questions
  4. Real-time Updates: Responses reflect the latest documentation in your datastore

Development

Modifying the Server

To add new capabilities:

  1. Add new tools by creating additional functions decorated with @mcp.tool()
  2. Define the tool's parameters using Python type hints
  3. Provide a clear docstring describing the tool's functionality

Example:

@mcp.tool()
def new_tool(param: str) -> str:
    """Description of what the tool does"""
    # Implementation goes here; return the tool's result as a string
    result = f"Processed: {param}"
    return result

Limitations

  • The server runs locally and may not work in remote development environments
  • Tool responses are subject to Contextual AI API limits and quotas
  • Currently only supports stdio transport mode

For all the capabilities of Contextual AI, please check the official documentation.
