# Gemini Workflow Bridge MCP Server
MCP server that bridges Claude Code to Gemini CLI for workflow tasks like codebase analysis, specification creation, and code review, leveraging Gemini's massive 2M token context window and cost-effectiveness for read-heavy operations.
## Overview
This MCP server extends Claude Code's capabilities by providing tools that delegate specific workflow tasks to Google's Gemini 2.0 Flash model. It's designed to optimize your development workflow by using each AI model's strengths:
- Gemini: Heavy context loading, codebase analysis, spec generation (2M token window)
- Claude Code: Precise code editing, implementation, and orchestration
## Why CLI-Based?
CLI-Only Design (v1.0.0+): This server uses the Gemini CLI instead of API calls. Key benefits:
- Zero API Costs: Uses your existing Gemini Code Assist subscription
- Simple Auth: Reuses your CLI credentials, no API key management
- No Extra Setup: If you have Gemini CLI installed, you're ready to go
- Same Power: Access to all Gemini models including 2.0 Flash
Perfect for developers who already have Gemini Code Assist!
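Under the hood, a bridge like this can simply pipe each prompt to the CLI on stdin, exactly as `echo "..." | gemini` does in the verification step below. A minimal sketch, assuming a subprocess-based wrapper (the `run_gemini_cli` helper and its defaults are illustrative, not this server's actual internals):

```python
import subprocess

def run_gemini_cli(prompt: str, command: str = "gemini", timeout: int = 300) -> str:
    """Send a prompt to a CLI tool on stdin and return its stdout.

    Equivalent to: echo "<prompt>" | gemini
    """
    result = subprocess.run(
        [command],
        input=prompt,          # prompt arrives on the CLI's stdin
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    if result.returncode != 0:
        raise RuntimeError(f"{command} failed: {result.stderr.strip()}")
    return result.stdout.strip()
```

Because the CLI process holds the credentials, no API key ever passes through the MCP server itself.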
## Features

### Tools
- `analyze_codebase_with_gemini` - Analyze large codebases using Gemini's 2M token context
- `create_specification_with_gemini` - Generate detailed technical specifications
- `review_code_with_gemini` - Comprehensive code review with multiple focus areas
- `generate_documentation_with_gemini` - Create documentation with full codebase context
- `ask_gemini` - General-purpose queries with optional codebase context
### Resources

- `workflow://specs/{name}` - Access saved specifications
- `workflow://reviews/{name}` - Access saved code reviews
- `workflow://context/{name}` - Access cached codebase analysis
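These `workflow://` URIs split cleanly with the standard library; a hedged sketch of how a server might route them (the function name is illustrative, not this server's actual code):

```python
from urllib.parse import urlparse

def parse_workflow_uri(uri: str) -> tuple[str, str]:
    """Split a workflow:// URI into (kind, name), e.g. ("specs", "my-feature")."""
    parsed = urlparse(uri)
    if parsed.scheme != "workflow":
        raise ValueError(f"not a workflow URI: {uri}")
    # The netloc carries the resource kind; the path carries the resource name.
    return parsed.netloc, parsed.path.lstrip("/")
```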
## Installation

### Prerequisites
- Python 3.11+
- Gemini CLI installed and authenticated
### Step 1: Install Gemini CLI

```bash
npm install -g @google/gemini-cli
```
### Step 2: Authenticate Gemini CLI

```bash
gemini
# Follow the authentication prompts
# Your credentials will be cached automatically
```
### Step 3: Install the MCP Server

Via uvx (recommended):

```bash
uvx hitoshura25-gemini-workflow-bridge
```

Or via pip:

```bash
pip install hitoshura25-gemini-workflow-bridge
```
## Configuration

### Verify Gemini CLI is Ready
```bash
# Check the CLI is installed
gemini --version
# Should show 0.13.0 or higher

# Test the CLI works
echo "What is 2+2?" | gemini
# Should return a response from Gemini
```
### Optional: Configure Model

Create a `.env` file (optional):
```bash
# NO API KEY NEEDED!
# Use "auto" to let the CLI choose the best model automatically
# (Pro models for complex tasks, Flash for simple/fast tasks)
GEMINI_MODEL=auto

# Or specify a specific model:
# GEMINI_MODEL=gemini-2.0-flash
# GEMINI_MODEL=gemini-1.5-pro

DEFAULT_SPEC_DIR=./specs
DEFAULT_REVIEW_DIR=./reviews
DEFAULT_CONTEXT_DIR=./.workflow-context
```
See .env.example for all available options.
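One way `GEMINI_MODEL=auto` could be resolved is a heuristic that routes heavy tasks to a Pro model and everything else to Flash. The sketch below is purely illustrative: the task-to-model mapping is an assumption, not this server's documented behavior.

```python
import os

# Assumed heuristic: which tasks count as "complex" is a guess for illustration.
COMPLEX_TASKS = {"analyze_codebase", "create_specification"}

def resolve_model(task: str) -> str:
    """Pick a model for a task, honoring an explicit GEMINI_MODEL override."""
    configured = os.environ.get("GEMINI_MODEL", "auto")
    if configured != "auto":
        return configured  # user pinned a specific model
    return "gemini-1.5-pro" if task in COMPLEX_TASKS else "gemini-2.0-flash"
```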
### Configure Claude Code

Add the server to your Claude Code MCP configuration (`~/.claude/config.json` or a workspace `.claude/config.json`):
Using uvx (recommended):

```json
{
  "mcpServers": {
    "gemini-workflow": {
      "command": "uvx",
      "args": ["hitoshura25-gemini-workflow-bridge"]
    }
  }
}
```
Or using pip:

```json
{
  "mcpServers": {
    "gemini-workflow": {
      "command": "python",
      "args": ["-m", "hitoshura25_gemini_workflow_bridge.server"]
    }
  }
}
```
Note: No API key needed in config! The MCP server uses your Gemini CLI credentials automatically.
## Usage Examples

### Example 1: Analyze Codebase

Claude Code can use this to analyze your codebase before implementing a feature:

```
User: "I want to add Redis caching to the product catalog API"

Claude Code (internally):
[Calls: analyze_codebase_with_gemini({
  focus_description: "product catalog API structure and caching opportunities",
  file_patterns: ["*.py", "*.js"],
  directories: ["src/api", "src/services"]
})]
```

Response:

```json
{
  "analysis": "The product catalog API is implemented in...",
  "architecture_summary": "Microservices architecture with...",
  "relevant_files": ["src/api/catalog.py", "src/services/product_service.py"],
  "cached_context_id": "ctx_abc123"
}
```
### Example 2: Generate Specification

```
User: "Create a detailed spec for the Redis caching feature"

Claude Code (internally):
[Calls: create_specification_with_gemini({
  feature_description: "Redis caching for product catalog API",
  context_id: "ctx_abc123",  // Reuse cached analysis
  spec_template: "standard"
})]
```

Response:

```json
{
  "spec_path": "./specs/redis-caching-for-product-catalog-api-spec.md",
  "implementation_tasks": [
    {"task": "Install redis-py dependency", "order": 1},
    {"task": "Create cache middleware", "order": 2},
    ...
  ],
  "estimated_complexity": "medium"
}
```
### Example 3: Code Review

```
User: "Review my changes before I commit"

Claude Code (internally):
[Calls: review_code_with_gemini({
  review_focus: ["security", "performance"],
  spec_path: "./specs/redis-caching-spec.md"
})]
```

Response:

```json
{
  "review_path": "./reviews/2025-01-10-123456-review.md",
  "has_blocking_issues": false,
  "summary": "Code looks good overall. Consider adding connection pooling."
}
```
### Example 4: Generate Documentation

```
User: "Generate API documentation for the catalog service"

Claude Code (internally):
[Calls: generate_documentation_with_gemini({
  documentation_type: "api",
  scope: "product catalog service",
  include_examples: true
})]
```

Response:

```json
{
  "doc_path": "./docs/api-documentation.md",
  "word_count": 2500
}
```
### Example 5: Ask Gemini

```
User: "Ask Gemini about the best caching strategy for this codebase"

Claude Code (internally):
[Calls: ask_gemini({
  prompt: "What's the best caching strategy for this product catalog API?",
  include_codebase_context: true,
  temperature: 0.7
})]
```

Response:

```json
{
  "response": "Based on your codebase architecture, I recommend...",
  "context_used": true
}
```
## Tool Reference

### analyze_codebase_with_gemini

Analyze a codebase using Gemini's 2M token context window.
Parameters:
- `focus_description` (string, required): What to focus on in the analysis
- `directories` (array, optional): Directories to analyze
- `file_patterns` (array, optional): File patterns to include (default: `["*.py", "*.js", "*.ts", "*.java", "*.go"]`)
- `exclude_patterns` (array, optional): Patterns to exclude (default: `["node_modules/", "dist/", "build/"]`)
Returns:

```json
{
  "analysis": "Detailed analysis text",
  "architecture_summary": "High-level overview",
  "relevant_files": ["file1.py", "file2.js"],
  "patterns_identified": ["pattern1", "pattern2"],
  "integration_points": ["point1", "point2"],
  "cached_context_id": "ctx_abc123"
}
```
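The default include/exclude patterns behave like shell globs; a minimal sketch of how a file path might be tested against them, assuming `fnmatch`-style matching on the basename and simple substring checks for excluded directories (the server's actual matching rules may differ):

```python
from fnmatch import fnmatch
from pathlib import PurePosixPath

def path_included(path: str, include: list[str], exclude: list[str]) -> bool:
    """Return True if path matches an include pattern and no exclude pattern."""
    # Exclude patterns like "node_modules/" are treated as substring matches.
    if any(part in path for part in exclude):
        return False
    # Include patterns like "*.py" are matched against the file's basename.
    name = PurePosixPath(path).name
    return any(fnmatch(name, pattern) for pattern in include)
```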
### create_specification_with_gemini

Generate a detailed technical specification.
Parameters:
- `feature_description` (string, required): What feature to specify
- `context_id` (string, optional): Context ID from a previous analysis
- `spec_template` (string, optional): Template to use (`"standard"`, `"minimal"`, `"comprehensive"`)
- `output_path` (string, optional): Where to save the spec
Returns:

```json
{
  "spec_path": "./specs/feature-spec.md",
  "spec_content": "Full markdown content",
  "implementation_tasks": [{"task": "...", "order": 1}],
  "estimated_complexity": "medium",
  "files_to_modify": ["file1.py"],
  "files_to_create": ["file2.py"]
}
```
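The spec path in Example 2 (`redis-caching-for-product-catalog-api-spec.md`) suggests the default filename is a slug of the feature description. A hedged sketch of that convention (the exact slug rules are an assumption inferred from the example):

```python
import re

def default_spec_path(feature_description: str, spec_dir: str = "./specs") -> str:
    """Derive a spec filename like './specs/<slugified-description>-spec.md'."""
    # Lowercase, collapse any non-alphanumeric run to a hyphen, trim edges.
    slug = re.sub(r"[^a-z0-9]+", "-", feature_description.lower()).strip("-")
    return f"{spec_dir}/{slug}-spec.md"
```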
### review_code_with_gemini

Run a comprehensive code review.
Parameters:
- `files` (array, optional): Files to review (default: git diff)
- `review_focus` (array, optional): Focus areas (default: `["security", "performance", "best-practices", "testing"]`)
- `spec_path` (string, optional): Specification to review against
- `output_path` (string, optional): Where to save the review
Returns:

```json
{
  "review_path": "./reviews/2025-01-10-review.md",
  "review_content": "Full markdown review",
  "issues_found": [{
    "severity": "high",
    "category": "security",
    "file": "auth.py",
    "line": 42,
    "issue": "Potential SQL injection",
    "suggestion": "Use parameterized queries"
  }],
  "has_blocking_issues": true,
  "summary": "Review summary"
}
```
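A plausible way `has_blocking_issues` is derived from `issues_found` is a severity threshold; this sketch assumes `high` and `critical` issues block, which is a guess for illustration, not documented behavior:

```python
# Assumed threshold: which severities block is not documented by the server.
BLOCKING_SEVERITIES = {"critical", "high"}

def has_blocking_issues(issues: list[dict]) -> bool:
    """True if any reported issue is severe enough to block a commit."""
    return any(issue.get("severity") in BLOCKING_SEVERITIES for issue in issues)
```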
### generate_documentation_with_gemini

Generate comprehensive documentation.
Parameters:
- `documentation_type` (string, required): Type (`"api"`, `"architecture"`, `"user-guide"`, `"readme"`, `"contributing"`)
- `scope` (string, required): What to document
- `output_path` (string, optional): Where to save the documentation
- `include_examples` (boolean, optional): Include code examples (default: true)
Returns:

```json
{
  "doc_path": "./docs/api-documentation.md",
  "doc_content": "Full markdown documentation",
  "sections": ["overview", "endpoints", "examples"],
  "word_count": 2500
}
```
### ask_gemini

Send a general-purpose query to Gemini.
Parameters:
- `prompt` (string, required): Question or task
- `include_codebase_context` (boolean, optional): Load the full codebase (default: false)
- `context_id` (string, optional): Reuse a cached context
- `temperature` (number, optional): Generation temperature, 0.0-1.0 (default: 0.7)
Returns:

```json
{
  "response": "Gemini's response",
  "context_used": true,
  "token_count": 150000
}
```
## Architecture
```
┌─────────────────────────────────────────────────────┐
│                  Claude Code CLI                    │
│         (Orchestrator - makes all decisions)        │
│                                                     │
│  "Let me analyze the codebase with Gemini..."       │
│  [Invokes: analyze_codebase_with_gemini]            │
└──────────────────────┬──────────────────────────────┘
                       │ MCP Protocol
                       ↓
┌─────────────────────────────────────────────────────┐
│         MCP Server: gemini-workflow-bridge          │
│                                                     │
│  Tools:                                             │
│   • analyze_codebase_with_gemini                    │
│   • create_specification_with_gemini                │
│   • review_code_with_gemini                         │
│   • generate_documentation_with_gemini              │
│   • ask_gemini                                      │
│                                                     │
│  Resources:                                         │
│   • workflow://specs/{feature-name}                 │
│   • workflow://reviews/{review-id}                  │
│   • workflow://context/{project-name}               │
└──────────────────────┬──────────────────────────────┘
                       │ Gemini CLI
                       ↓
┌─────────────────────────────────────────────────────┐
│               Google Gemini 2.0 Flash               │
│      (2M token context, fast, cost-effective)       │
└─────────────────────────────────────────────────────┘
```
## Development

### Setup Development Environment

```bash
# Clone the repository
git clone https://github.com/hitoshura25/gemini-workflow-bridge-mcp
cd gemini-workflow-bridge-mcp

# Create a virtual environment
python -m venv venv
source venv/bin/activate  # or venv\Scripts\activate on Windows

# Install in development mode
pip install -e ".[dev]"

# Copy the environment template (optional)
cp .env.example .env
# Edit .env to customize the model or directories if needed
```
### Run Tests

```bash
pytest
```
### Run Linting

```bash
ruff check .
mypy .
```
## License

Apache-2.0 - see LICENSE for details.
## Credits
- Built with MCP
- Powered by Google Gemini 2.0 Flash
- Generated with mcp-server-generator
## Support

- Issues: GitHub Issues