
Gemini Research MCP Server

PyPI version Python 3.12+ License: MIT

MCP server for AI-powered research using Gemini. Fast grounded search + comprehensive Deep Research + session management.

Architecture

flowchart TB
    subgraph Client["MCP Client"]
        Claude["Claude / Copilot"]
    end

    subgraph Server["gemini-research-mcp"]
        direction TB
        FastMCP["FastMCP Server<br/>@mcp.tool()"]
        
        subgraph Tools["Tools"]
            RW["research_web<br/>Quick lookup 5-30s"]
            RD["research_deep<br/>Autonomous 3-20min"]
            RF["research_followup<br/>Continue session"]
            RR["resume_research<br/>Recover interrupted"]
            FW["fetch_webpage<br/>Content extraction"]
            EX["export_research_session<br/>MD/JSON/DOCX"]
            LS["list_research_sessions"]
        end

        subgraph Modules["Core Modules"]
            Quick["quick.py<br/>Web grounding"]
            Deep["deep.py<br/>Deep research agent"]
            Content["content.py<br/>SSRF protection"]
            Sessions["sessions.py<br/>Session manager"]
        end
    end

    subgraph External["External Services"]
        Gemini["Google Gemini API<br/>2.0 Flash"]
        Web["Web Sources<br/>via trafilatura"]
    end

    subgraph Storage["Persistence"]
        SQLite["SQLite<br/>~/.gemini-research/"]
    end

    Claude -->|"MCP Protocol"| FastMCP
    FastMCP --> Tools
    
    RW --> Quick
    RD --> Deep
    RF --> Sessions
    RR --> Sessions
    FW --> Content
    
    Quick -->|"grounding"| Gemini
    Deep -->|"agentic"| Gemini
    Content -->|"httpx"| Web
    Sessions --> SQLite
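
The routing shown in the diagram can be sketched as a simple dispatch table. This is an illustrative sketch only: the real server registers handlers via FastMCP's `@mcp.tool()` decorator, and the handler names and signatures below are hypothetical, not the package's actual API.

```python
# Hypothetical sketch of the tool-to-module routing in the diagram above.
# Handler names are illustrative; the real implementation lives in
# quick.py, deep.py, and content.py behind FastMCP tool registrations.

def quick_research(query: str) -> str:
    return f"quick: {query}"        # quick.py - web grounding

def deep_research(query: str) -> str:
    return f"deep: {query}"         # deep.py - deep research agent

def fetch_content(url: str) -> str:
    return f"fetched: {url}"        # content.py - SSRF-protected fetch

TOOL_ROUTES = {
    "research_web": quick_research,
    "research_deep": deep_research,
    "fetch_webpage": fetch_content,
}

def dispatch(tool: str, arg: str) -> str:
    """Route an MCP tool call to its backing module handler."""
    return TOOL_ROUTES[tool](arg)
```
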

Tools

| Tool | Description | Latency |
| --- | --- | --- |
| `research_web` | Fast web search with citations | 5-30 sec |
| `research_deep` | Multi-step autonomous research | 3-20 min |
| `resume_research` | Resume interrupted/in-progress sessions | instant |
| `research_followup` | Continue conversation after research | 5-30 sec |
| `list_research_sessions` | List saved research sessions | instant |
| `export_research_session` | Export to Markdown, JSON, or DOCX | instant |
| `fetch_webpage` | Extract article content (SSRF-protected) | 0.5-2 sec |

Power User Workflow

Key insight: Gemini Deep Research runs asynchronously on Google's servers. Even if VS Code disconnects, your research continues. The resume_research tool retrieves completed work.
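
Because the job runs server-side, a reconnecting client only needs to check session state and act on it. A minimal sketch of that decision logic follows; the status labels are assumptions for illustration, not the server's actual schema.

```python
def resume_action(status: str) -> str:
    """Decide what a resume_research-style call should do next.

    Status values here ("completed", "running", "pending", "failed")
    are hypothetical labels, not the package's documented states.
    """
    if status == "completed":
        return "return-final-report"   # work finished while client was away
    if status in ("running", "pending"):
        return "poll-again-later"      # Google is still researching
    if status == "failed":
        return "surface-error"         # report the failure to the user
    raise ValueError(f"unknown status: {status}")
```
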

Features

  • Auto-Clarification: research_deep asks clarifying questions for vague queries via MCP Elicitation
  • MCP Tasks: Real-time progress with streaming updates
  • Session Persistence: Research sessions are automatically saved and can be resumed later
  • Export Formats: Export to Markdown, JSON, or professional DOCX with Table of Contents
  • File Search: Search your own data alongside web using file_search_store_names
  • Format Instructions: Control report structure (sections, tables, tone)
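
Session persistence as described above can be illustrated with a minimal SQLite sketch. The schema below is invented for illustration; the actual layout of the store in ~/.gemini-research/ is not documented here.

```python
import json
import sqlite3

# Minimal illustration of saving and resuming a research session in SQLite.
# The real sessions.py schema is not shown in this README; this one is a guess.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sessions (id TEXT PRIMARY KEY, status TEXT, payload TEXT)"
)

def save_session(sid: str, status: str, data: dict) -> None:
    """Upsert a session row, serializing its payload as JSON."""
    conn.execute(
        "INSERT OR REPLACE INTO sessions VALUES (?, ?, ?)",
        (sid, status, json.dumps(data)),
    )

def load_session(sid: str) -> tuple[str, dict]:
    """Fetch a session's status and deserialized payload."""
    status, payload = conn.execute(
        "SELECT status, payload FROM sessions WHERE id = ?", (sid,)
    ).fetchone()
    return status, json.loads(payload)

save_session("abc123", "running", {"query": "quantum error correction"})
status, data = load_session("abc123")
```
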

Installation

PyPI (recommended)

pip install gemini-research-mcp
# or
uv add gemini-research-mcp

Claude Desktop (MCPB Bundle)

Download the .mcpb bundle from GitHub Releases and open it in Claude Desktop for single-click installation.

The bundle uses the UV runtime: dependencies are installed automatically, and no separate Python installation is required.

Configuration

| Variable | Required | Default | Description |
| --- | --- | --- | --- |
| `GEMINI_API_KEY` | Yes | — | Google AI Studio API key |
| `GEMINI_MODEL` | No | `gemini-3-flash-preview` | Model for research_web |
| `GEMINI_SUMMARY_MODEL` | No | `gemini-3-flash-preview` | Model for session summaries (fast) |
| `DEEP_RESEARCH_AGENT` | No | `deep-research-pro-preview-12-2025` | Agent for research_deep |

cp .env.example .env
# Edit .env with your API key

Usage

VS Code MCP

Add to .vscode/mcp.json:

{
  "servers": {
    "gemini-research": {
      "command": "uvx",
      "args": ["gemini-research-mcp"],
      "env": {
        "GEMINI_API_KEY": "your-api-key"
      }
    }
  }
}

Or run from source:

{
  "servers": {
    "gemini-research": {
      "command": "uv",
      "args": ["run", "--directory", "path/to/gemini-research-mcp", "gemini-research-mcp"],
      "envFile": "${workspaceFolder}/path/to/gemini-research-mcp/.env"
    }
  }
}

Command Line

uv run gemini-research-mcp
# or
uvx gemini-research-mcp

DOCX Export

Export research sessions to professional Word documents with:

  • Cover page with title, date, and research metadata
  • Clickable Table of Contents with navigation to sections
  • Professional typography: Calibri fonts, 1-inch margins, 1.5x line spacing
  • Executive summary with elegant formatting
  • Full research report with proper heading hierarchy
  • Sources section with full clickable URLs
  • Metadata table with session details

VS Code Setup

To enable DOCX export, install with the [docx] extra:

{
  "servers": {
    "gemini-research": {
      "command": "uvx",
      "args": ["--from", "gemini-research-mcp[docx]", "gemini-research-mcp"],
      "env": {
        "GEMINI_API_KEY": "your-api-key"
      }
    }
  }
}

Downloading Files

After running export_research_session with format: "docx", the tool returns a resource URI:

research://exports/{export_id}

In VS Code Copilot Chat, you can:

  • Click "Save" on the resource attachment to download the .docx file
  • Drag-and-drop from the chat into your workspace
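
The export_id can be pulled out of a research:// URI with standard URL parsing. This helper is illustrative, not part of the package, and the example export_id is made up.

```python
from urllib.parse import urlparse

def parse_export_uri(uri: str) -> str:
    """Extract the export_id from a research://exports/{export_id} URI."""
    parsed = urlparse(uri)
    if parsed.scheme != "research":
        raise ValueError(f"not a research URI: {uri}")
    # For research://exports/{id}, netloc is "exports" and path is "/{id}"
    return parsed.path.lstrip("/")

export_id = parse_export_uri("research://exports/20240101-abc")
```
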

Installation (pip/uv)

# Install with DOCX support
pip install 'gemini-research-mcp[docx]'
# or
uv add 'gemini-research-mcp[docx]'

Features

| Feature | Description |
| --- | --- |
| Cover Page | Title, date, duration, tokens, AI agent |
| Clickable TOC | Internal hyperlinks navigate to sections |
| Syntax Highlighting | Pygments-powered code blocks with GitHub colors |
| Professional Styling | Calibri fonts, proper heading hierarchy (H1-H4) |
| Page Margins | Standard 1-inch (2.54 cm) margins |
| Heading Spacing | keep_with_next prevents orphan headings |
| Sources | Full URLs as clickable hyperlinks |
| Pure Python | No external binaries (Pandoc not required) |

Resources

MCP Resources provide read-only data that clients can access:

| Resource | Description |
| --- | --- |
| `research://models` | Available models and their capabilities |
| `research://exports` | List cached exports ready for download |
| `research://exports/{id}` | Download an exported file (Markdown, JSON, or DOCX) |

File Downloads

The export_research_session tool creates exports and returns a resource URI. Clients (like VS Code) can then fetch the resource to download the file with proper MIME type handling.
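
The MIME types involved can be made explicit. The mapping below lists the standard types for the three export formats; the server's actual mapping is an assumption here.

```python
# Standard MIME types for the three export formats.
# Whether the server uses exactly this mapping is an assumption.
EXPORT_MIME_TYPES = {
    "md": "text/markdown",
    "json": "application/json",
    "docx": (
        "application/vnd.openxmlformats-officedocument"
        ".wordprocessingml.document"
    ),
}

def mime_for(filename: str) -> str:
    """Return the MIME type for an exported file, by extension."""
    ext = filename.rsplit(".", 1)[-1].lower()
    return EXPORT_MIME_TYPES.get(ext, "application/octet-stream")
```
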

Development

uv sync --extra dev
uv run pytest
uv run mypy src/
uv run ruff check src/

Tests

uv run pytest                    # Unit tests
uv run pytest -m e2e             # E2E tests (requires GEMINI_API_KEY)
uv run pytest --cov=src/gemini_research_mcp  # With coverage

Pricing

| Tool | Typical Cost |
| --- | --- |
| `research_web` | ~$0.01-0.05 per query |
| `research_deep` | ~$2-5 per task |

Deep Research uses ~80-160 searches and ~250k-900k tokens per task.
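
Those ranges imply a rough back-of-envelope rate check. The arithmetic below uses only the numbers quoted above, not Google's actual price list.

```python
# Implied cost per million tokens for research_deep, derived solely from
# the quoted ranges: ~$2-5 per task and ~250k-900k tokens per task.
low_cost, high_cost = 2.0, 5.0              # dollars per task
low_tokens, high_tokens = 250_000, 900_000  # tokens per task

# Cheapest implied rate: high token count at the low end of cost.
min_rate = low_cost / (high_tokens / 1_000_000)   # $/Mtok
# Priciest implied rate: low token count at the high end of cost.
max_rate = high_cost / (low_tokens / 1_000_000)   # $/Mtok
```
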

License

MIT

