A Model Context Protocol server for analyzing large codebases using Gemini 2.5 Pro

Project description

DeepView MCP

DeepView MCP is a Model Context Protocol server that enables IDEs like Cursor and Windsurf to analyze large codebases using Gemini 2.5 Pro's extensive context window.

Features

  • Load an entire codebase from a single text file (e.g., created with tools like repomix)
  • Query the codebase using Gemini 2.5 Pro's large context window (up to 2M tokens)
  • Connect to IDEs that support the MCP protocol, like Cursor and Windsurf
  • Configurable Gemini model selection via command-line arguments

Prerequisites

  • Python 3
  • A Google Gemini API key (exposed to the server as GEMINI_API_KEY)

Installation

Using pip

pip install deepview-mcp

Using Poetry

poetry add deepview-mcp

From Source

  1. Clone the repository:

    git clone https://github.com/ddegtyarev/deepview-mcp.git
    cd deepview-mcp
    
  2. Install with Poetry:

    poetry install
    
  3. Set up your environment variables:

    cp .env.example .env
    

    Then edit the .env file and add your Gemini API key (see the example below).
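
    For example, the .env file needs a single line with your key (the value below is a placeholder):

        GEMINI_API_KEY=your_gemini_api_key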

Usage

Starting the Server

Run the server with:

# Basic usage with default settings
deepview-mcp [path/to/codebase.txt]

# Specify a different Gemini model
deepview-mcp [path/to/codebase.txt] --model gemini-2.0-pro

# Change log level
deepview-mcp [path/to/codebase.txt] --log-level DEBUG

The codebase file parameter is optional. If not provided, you'll need to specify it when making queries.

Command-line Options

  • --model MODEL: Specify the Gemini model to use (default: gemini-2.5-pro-exp-03-25)
  • --log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}: Set the logging level (default: INFO)

Using with an IDE

Cursor

  1. Open Cursor's settings
  2. Find the MCP section and add a new server
  3. Use a config like: {"command": "deepview-mcp", "args": ["path/to/codebase.txt"]} (a fuller example follows this list)
  4. Restart Cursor to connect
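
If Cursor accepts the same mcpServers JSON shape as the Windsurf example below (an assumption; check Cursor's MCP settings for the exact format and file location), the full entry might look like this:

    {
      "mcpServers": {
        "deepview": {
          "command": "deepview-mcp",
          "args": ["path/to/codebase.txt"],
          "env": {
            "GEMINI_API_KEY": "your_gemini_api_key"
          }
        }
      }
    }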

Windsurf

  1. Open Windsurf's settings
  2. Navigate to the MCP configuration
  3. Add a new MCP server with the following configuration:
    {
      "mcpServers": {
        "deepview": {
          "command": "/path/to/deepview-mcp",
          "args": ["/path/to/codebase.txt"],
          "env": {
            "GEMINI_API_KEY": "your_gemini_api_key"
          }
        }
      }
    }
    
  4. Restart Windsurf to connect

Available Tools

The server provides one tool (an example call is shown after the list):

  1. deepview: Ask a question about the codebase
    • Required parameter: question - The question to ask about the codebase
    • Optional parameter: codebase_file - Path to a codebase file to load before querying
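
Under the hood, the IDE invokes this tool with a standard MCP tools/call request; a minimal sketch, with the question and file path as placeholders:

    {
      "jsonrpc": "2.0",
      "id": 1,
      "method": "tools/call",
      "params": {
        "name": "deepview",
        "arguments": {
          "question": "Where is request authentication implemented?",
          "codebase_file": "/path/to/repomix-output.xml"
        }
      }
    }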

Preparing Your Codebase

DeepView MCP works best with a single file containing your entire codebase. You can use repomix to prepare your codebase in an AI-friendly format.

Using repomix

  1. Basic Usage: Run repomix in your project directory to create a default output file:
# Make sure you're using Node.js 18.17.0 or higher
npx repomix

This will generate a repomix-output.xml file containing your codebase.

  2. Custom Configuration: Create a configuration file to customize which files get packaged and the output format:
npx repomix --init

This creates a repomix.config.json file that you can edit to:

  • Include/exclude specific files or directories
  • Change the output format (XML, JSON, TXT)
  • Set the output filename
  • Configure other packaging options

  3. Loading the Codebase: Use the generated file with DeepView MCP:
deepview-mcp /path/to/repomix-output.xml

Example repomix Configuration

Here's an example repomix.config.json file:

{
  "include": [
    "**/*.py",
    "**/*.js",
    "**/*.ts",
    "**/*.jsx",
    "**/*.tsx"
  ],
  "exclude": [
    "node_modules/**",
    "venv/**",
    "**/__pycache__/**",
    "**/test/**"
  ],
  "output": {
    "format": "xml",
    "filename": "my-codebase.xml"
  }
}
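
With this configuration, repomix writes my-codebase.xml, which you then pass to the server:

deepview-mcp my-codebase.xml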

For more information on repomix, visit the repomix GitHub repository.

License

MIT

Author

Dmitry Degtyarev (ddegtyarev@gmail.com)

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

deepview_mcp-0.2.0.tar.gz (6.3 kB)


Built Distribution

deepview_mcp-0.2.0-py3-none-any.whl (8.7 kB)


File details

Details for the file deepview_mcp-0.2.0.tar.gz.

File metadata

  • Download URL: deepview_mcp-0.2.0.tar.gz
  • Upload date:
  • Size: 6.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for deepview_mcp-0.2.0.tar.gz

  • SHA256: 2bcc371d6b207820ce9e078443cc48c54557ac7ae91d8bc3b7acd2c23504a49d
  • MD5: 96b8452848878724b7dd8794913850bb
  • BLAKE2b-256: 5de6b0de213f8bcc081a9a48d153738899c7e7939c9db7325ceee9c4e13c3f7e


File details

Details for the file deepview_mcp-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: deepview_mcp-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 8.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for deepview_mcp-0.2.0-py3-none-any.whl

  • SHA256: 54d22834d8e2b39fd864eba5fd07f51d8f0103d263b78da705ea759a4940605b
  • MD5: 60558d0808992ad1ee9b23b4639f663c
  • BLAKE2b-256: 11d6879c76f0010269c7767b2d3027fe145beb4b53a98417aac3cdd9cd8c67de

