
Context MCP Server

A Model Context Protocol (MCP) server that provides intelligent context management and web content fetching capabilities. This server enables AI assistants to efficiently store, retrieve, and manage contextual data while also fetching web content for real-time information access.

Features

  • 🔍 Smart Content Fetching: Retrieve web content using Jina Reader API with fallback mechanisms
  • 🌐 Web Content Processing: Convert HTML to markdown for better AI consumption
  • 💾 File Management: Save fetched content to organized file structures
  • 🚀 High Performance: Optimized fetching algorithms with intelligent caching
  • 🔧 Easy Integration: Standard MCP protocol compatibility with various AI clients

Available Tools

fetch

Fetches content from a URL and returns it as text. The tool tries the Jina Reader API first and falls back to a direct HTTP request if that fails.

Arguments:

  • url (string, required): The URL to fetch content from
  • max_length (integer, optional): Maximum number of characters to return (default: 5000)
  • start_index (integer, optional): Start content from this character index (default: 0)
  • raw (boolean, optional): Get raw content without markdown conversion (default: false)

Returns:

  • The content of the URL as text

Example usage:

Please fetch the content from https://example.com
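The Jina-first, direct-HTTP-fallback behavior, combined with the max_length/start_index windowing, might look roughly like the sketch below. The r.jina.ai endpoint prefix and all helper names are assumptions for illustration, not taken from the package source:

```python
import urllib.request

# Public Jina Reader endpoint (assumed): prefixing a URL with this
# returns a cleaned, markdown-friendly rendering of the page.
JINA_READER_PREFIX = "https://r.jina.ai/"

def slice_content(text: str, start_index: int = 0, max_length: int = 5000) -> str:
    """Apply the start_index/max_length windowing described for the fetch tool."""
    return text[start_index:start_index + max_length]

def fetch(url: str, start_index: int = 0, max_length: int = 5000) -> str:
    """Try the Jina Reader API first; fall back to a direct HTTP request."""
    try:
        with urllib.request.urlopen(JINA_READER_PREFIX + url, timeout=30) as resp:
            text = resp.read().decode("utf-8", errors="replace")
    except Exception:
        # Fallback: plain HTTP GET against the original URL
        with urllib.request.urlopen(url, timeout=30) as resp:
            text = resp.read().decode("utf-8", errors="replace")
    return slice_content(text, start_index, max_length)
```

The start_index argument makes pagination possible: a client can request the next window of a long page by passing the index where the previous call left off.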

fetch_and_save

Fetches content from a URL and saves it to a file. The tool tries the Jina Reader API first and falls back to a direct HTTP request if that fails.

Arguments:

  • url (string, required): The URL to fetch content from
  • file_path (string, optional): The path to save the file to. If not provided, a filename is generated automatically from the URL domain and a timestamp
  • raw (boolean, optional): Get raw content without markdown conversion (default: false)

Returns:

  • The path where the file was saved

Example usage:

Please fetch and save the content from https://example.com to article.txt

Or with automatic naming:

Please fetch and save the content from https://example.com

Available Prompts

  • fetch
    • Fetch a URL and extract its contents as markdown
    • Arguments:
      • url (string, required): URL to fetch

Installation and Usage

Local Development Setup

  1. Clone or download the source code:

    git clone https://github.com/LangGPT/context-mcp-server.git
    cd context-mcp-server
    
  2. Install dependencies using uv:

    uv sync
    
  3. Test the server:

    uv run python -m context_mcp_server --help
    

Using with Claude Desktop (Local Source)

Add this configuration to your Claude Desktop config file:

{
  "mcpServers": {
    "context-mcp-server": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/your/context-mcp-server",
        "python",
        "-m",
        "context_mcp_server"
      ],
      "env": {
        "CONTEXT_DIR": "/path/to/your/data/directory"
      }
    }
  }
}

Configuration file locations:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%/Claude/claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

Using with VS Code (Local Source)

Add to your VS Code settings or .vscode/mcp.json:

{
  "mcpServers": {
    "context-mcp-server": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/your/context-mcp-server",
        "python",
        "-m",
        "context_mcp_server"
      ],
      "env": {
        "CONTEXT_DIR": "/path/to/your/data/directory"
      }
    }
  }
}

Installation via Package Manager

Using uv (recommended)

When using uv, no installation step is needed; uvx runs context-mcp-server directly:

uvx context-mcp-server

Using pip

pip install context-mcp-server

After installation, run it as:

python -m context_mcp_server

Package Manager Configuration

Claude Desktop with uvx

{
  "mcpServers": {
    "context-mcp-server": {
      "command": "uvx",
      "args": ["context-mcp-server"],
      "env": {
        "CONTEXT_DIR": "/path/to/your/data/directory"
      }
    }
  }
}

VS Code with uvx

{
  "mcp": {
    "servers": {
      "context-mcp-server": {
        "command": "uvx",
        "args": ["context-mcp-server"],
        "env": {
          "CONTEXT_DIR": "/path/to/your/data/directory"
        }
      }
    }
  }
}

Configuration

Environment Variables

CONTEXT_DIR

Sets the working directory where files will be saved when using the fetch_and_save tool.

  • Default: data
  • Priority: CONTEXT_DIR environment variable > default value data

Example:

export CONTEXT_DIR=/path/to/your/data
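The precedence rule amounts to a one-line environment lookup; a minimal sketch of how the server might resolve the directory (the helper name is an assumption):

```python
import os

def resolve_context_dir(default: str = "data") -> str:
    """The CONTEXT_DIR environment variable wins; otherwise fall back to 'data'."""
    return os.environ.get("CONTEXT_DIR", default)
```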

Command Line Arguments

--user-agent

By default, the server selects one of two user-agent strings, depending on whether the request was initiated by the model (via a tool) or by the user (via a prompt):

ModelContextProtocol/1.0 (Autonomous; +https://github.com/modelcontextprotocol/servers)

or:

ModelContextProtocol/1.0 (User-Specified; +https://github.com/modelcontextprotocol/servers)

This can be customized by adding the argument --user-agent=YourUserAgent to the args list in the configuration.

--proxy-url

The server can be routed through a proxy with the --proxy-url argument.
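For example, both flags can be appended to the args list of a client configuration entry (the flag values below are placeholders):

```json
{
  "mcpServers": {
    "context-mcp-server": {
      "command": "uvx",
      "args": [
        "context-mcp-server",
        "--user-agent=MyAgent/1.0",
        "--proxy-url=http://localhost:8080"
      ]
    }
  }
}
```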

Development

Setting up Development Environment

  1. Install development dependencies:

    uv sync --dev
    
  2. Run linting and type checking:

    uv run ruff check
    uv run pyright
    
  3. Build the package:

    uv build
    

Testing

Test the server locally:

uv run python -m context_mcp_server

With custom work directory:

CONTEXT_DIR=/custom/path uv run python -m context_mcp_server

Use the MCP inspector for debugging:

npx @modelcontextprotocol/inspector uv run python -m context_mcp_server

With custom work directory:

CONTEXT_DIR=/custom/path npx @modelcontextprotocol/inspector uv run python -m context_mcp_server

Making Changes

  1. Edit the source code in src/context_mcp_server/
  2. Test your changes with uv run python -m context_mcp_server
  3. Update version in pyproject.toml if needed
  4. Run tests and linting

Debugging

You can use the MCP inspector to debug the server:

For local development:

npx @modelcontextprotocol/inspector uv run python -m context_mcp_server

For uvx installations:

npx @modelcontextprotocol/inspector uvx context-mcp-server

Contributing

We encourage contributions to help expand and improve context-mcp-server. Whether you want to add new tools, enhance existing functionality, or improve documentation, your input is valuable.

License

context-mcp-server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.


