# Gemini Workflow Bridge MCP

An MCP server that bridges Claude Code to the Gemini CLI for workflow tasks such as codebase analysis, specification creation, and code review.

**Gemini as Context Compression Engine + Claude as Reasoning Engine = A-Grade Results**
## Overview

This MCP server acts as a context compression engine that combines the complementary strengths of Claude Code and Gemini.
## Key Features

- ✅ Quality: A-grade specifications (Gemini provides facts, Claude does the reasoning)
- ✅ Cost: 47-61% reduction in Claude tokens (expensive operations move to Gemini's free tier)
- ✅ Compression: up to 174:1 token compression (50K tokens → ~300-token summaries)
- ✅ DX: auto-generated workflows and slash commands for common tasks
## Architecture

```
┌─────────────────────────────────────────┐
│ Claude Code (Reasoning Engine)          │
│ - Superior planning & specifications    │
│ - Precise code editing                  │
│ - A-grade output quality                │
└──────────────┬──────────────────────────┘
               │ MCP Protocol
               ↓
┌─────────────────────────────────────────┐
│ MCP Server (Compression Layer)          │
│ - 50K tokens → 300-token summaries      │
│ - Fact extraction only                  │
│ - Validation & consistency checks       │
└──────────────┬──────────────────────────┘
               │ Gemini CLI
               ↓
┌─────────────────────────────────────────┐
│ Gemini (Context Engine)                 │
│ - 2M token window (free tier)           │
│ - Factual extraction only               │
│ - No opinions or planning               │
└─────────────────────────────────────────┘
```
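Conceptually, the compression layer is a thin wrapper that shells out to the Gemini CLI with a strictly factual prompt and a small token budget. A minimal sketch of the idea (the helper names and prompt wording here are illustrative, not the server's actual implementation; `gemini -p` is the CLI's one-shot prompt flag):

```python
import subprocess

def build_fact_prompt(question: str, scope: str, max_tokens: int = 300) -> str:
    # Constrain Gemini to factual extraction within a small token budget,
    # mirroring the "facts only, no opinions or planning" contract above.
    return (
        f"Answer factually, citing file:line references, in under "
        f"{max_tokens} tokens. No opinions or planning.\n"
        f"Scope: {scope}\nQuestion: {question}"
    )

def extract_facts(question: str, scope: str) -> str:
    # Shell out to the Gemini CLI; the huge context stays on Gemini's side
    # and only the compressed answer returns to Claude.
    result = subprocess.run(
        ["gemini", "-p", build_fact_prompt(question, scope)],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()
```

The real server adds caching, validation, and consistency checks on top of this round trip.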
## Installation

### Prerequisites

- **Gemini CLI** - install and authenticate:

  ```bash
  npm install -g @google/gemini-cli
  gemini  # Follow the authentication prompts
  ```

- **Python 3.11+** with pip
### Install the MCP Server

```bash
# Clone the repository
git clone https://github.com/hitoshura25/gemini-workflow-bridge-mcp
cd gemini-workflow-bridge-mcp

# Install dependencies
pip install -e .
```
### Configure Claude Code

Add the server to your Claude Code MCP settings (typically `claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "gemini-workflow-bridge": {
      "command": "python",
      "args": ["-m", "hitoshura25_gemini_workflow_bridge"],
      "env": {
        "CONTEXT_CACHE_TTL_MINUTES": "30",
        "MAX_TOKENS_PER_ANSWER": "300",
        "TARGET_COMPRESSION_RATIO": "100"
      }
    }
  }
}
```
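If the config file already registers other MCP servers, the new entry has to be merged in rather than overwriting the file. A small sketch of that merge (the config path is whatever your Claude Code installation uses; `add_bridge_server` is a hypothetical helper, not part of this package):

```python
import json
import pathlib

def add_bridge_server(config_path: str) -> None:
    # Merge the bridge entry into an existing Claude Code MCP config,
    # preserving any servers that are already registered.
    path = pathlib.Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["gemini-workflow-bridge"] = {
        "command": "python",
        "args": ["-m", "hitoshura25_gemini_workflow_bridge"],
        "env": {
            "CONTEXT_CACHE_TTL_MINUTES": "30",
            "MAX_TOKENS_PER_ANSWER": "300",
            "TARGET_COMPRESSION_RATIO": "100",
        },
    }
    path.write_text(json.dumps(config, indent=2))
```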
## Quick Start

```python
# 1. Extract facts about your codebase
query_codebase_tool(
    questions=["How is authentication implemented?"],
    scope="src/"
)
# Returns: compressed facts with file:line references

# 2. Create a specification using those facts (Claude does this)
# [Your reasoning creates the A-grade spec here]

# 3. Validate the specification
validate_against_codebase_tool(
    spec_content="...",
    validation_checks=["missing_files", "undefined_dependencies"]
)
# Returns: completeness score, issues, suggestions
```
## Documentation

- Full Documentation - you're reading it!
- Implementation Plan - architecture details
- Configuration Guide - all configuration options
## Tools Overview

### 🔍 Tier 1: Fact Extraction

| Tool | Purpose | Key Feature |
|---|---|---|
| `query_codebase_tool()` | Multi-question analysis | 174:1 compression ratio |
| `find_code_by_intent_tool()` | Semantic search | Returns summaries, not full code |
| `trace_feature_tool()` | Follow execution flow | Step-by-step with data flow |
| `list_error_patterns_tool()` | Extract patterns | Filtering at the edge |

### ✅ Tier 2: Validation

| Tool | Purpose |
|---|---|
| `validate_against_codebase_tool()` | Validate specs for completeness |
| `check_consistency_tool()` | Verify pattern alignment |

### 🚀 Tier 3: Workflow Automation

| Tool | Purpose |
|---|---|
| `generate_feature_workflow_tool()` | Generate executable workflows |
| `generate_slash_command_tool()` | Create custom slash commands |
## Example: Complete Feature Implementation

User: "Add Redis caching to the product API"

```
# Step 1: Extract facts
→ query_codebase_tool(questions=[...])
← 52K tokens → 387 tokens (134:1 compression)

# Step 2: Claude creates an A-grade spec from the facts
→ [Your superior reasoning]
← High-quality specification

# Step 3: Validate the spec
→ validate_against_codebase_tool(spec=...)
← Completeness: 92%, 1 minor issue

# Step 4: Implement
→ [Your precise code editing]
```

Result: ✅ A-grade spec, 61% token savings, 3.5 minutes
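The compression figure in Step 1 is simple arithmetic: 52K raw tokens collapse to 387, so Claude reads well under 1% of the original context. (The overall 61% savings is lower because Claude still spends tokens on its own reasoning and editing.) A quick check of the numbers:

```python
def compression_stats(raw_tokens: int, compressed_tokens: int) -> tuple[float, float]:
    # Ratio of raw context to compressed summary, and the fraction of
    # context tokens Claude avoids by reading the summary instead.
    ratio = raw_tokens / compressed_tokens
    savings = 1 - compressed_tokens / raw_tokens
    return ratio, savings

ratio, savings = compression_stats(52_000, 387)
print(f"{ratio:.0f}:1 compression, {savings:.1%} of context tokens avoided")
# → 134:1 compression, 99.3% of context tokens avoided
```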
## How It Works

| Aspect | Description |
|---|---|
| Spec creation | Claude generates A-grade specifications |
| Token usage | ~3,100 Claude tokens (61% reduction vs. traditional approaches) |
| Gemini's role | Provides facts only |
| Claude's role | Creates the spec from scratch using those facts |
| Quality | A-grade |
| Workflows | Auto-generated |
## Configuration

Key environment variables (see `.env.example` for the full list):

```bash
CONTEXT_CACHE_TTL_MINUTES=30   # Cache duration
MAX_TOKENS_PER_ANSWER=300      # Compression target
TARGET_COMPRESSION_RATIO=100   # Aim for 100:1
GEMINI_MODEL=auto              # or a specific model
```
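Server-side, variables like these are typically read from the environment with the documented values as fallbacks. A sketch of that pattern (the helper name and the returned dict shape are illustrative, not this package's actual API):

```python
import os

def load_bridge_config() -> dict:
    # Read the bridge's tunables from the environment, falling back to
    # the documented defaults. Names mirror the variables above.
    return {
        "cache_ttl_minutes": int(os.getenv("CONTEXT_CACHE_TTL_MINUTES", "30")),
        "max_tokens_per_answer": int(os.getenv("MAX_TOKENS_PER_ANSWER", "300")),
        "target_compression_ratio": int(os.getenv("TARGET_COMPRESSION_RATIO", "100")),
        "gemini_model": os.getenv("GEMINI_MODEL", "auto"),
    }
```

Because the MCP config's `env` block sets these same variables, values placed there override the defaults without any code changes.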
## Usage Example

```python
# 1. Get facts
facts = query_codebase_tool(questions=[...])

# 2. Create the spec (Claude does this with superior reasoning)
spec = create_your_a_grade_spec(facts)

# 3. Validate
validate_against_codebase_tool(spec=spec)
```
## Troubleshooting

**"Gemini CLI not found"**

```bash
npm install -g @google/gemini-cli
```

**"Empty response from Gemini"**

```bash
gemini --version  # Check the installation
gemini            # Re-authenticate if needed
```
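Both failure modes above can be caught up front before a workflow starts. A small, hypothetical preflight check (not part of this package) along those lines:

```python
import shutil
import sys

def check_prerequisites() -> list[str]:
    # Report missing prerequisites instead of failing mid-workflow.
    problems = []
    if shutil.which("gemini") is None:
        problems.append(
            "Gemini CLI not on PATH: npm install -g @google/gemini-cli"
        )
    if sys.version_info < (3, 11):
        problems.append("Python 3.11+ is required")
    return problems

for problem in check_prerequisites():
    print(f"⚠ {problem}")
```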
## Development

```bash
# Install dev dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Run with debug logging
DEBUG_MODE=true python -m hitoshura25_gemini_workflow_bridge
```
## Project Structure

```
hitoshura25_gemini_workflow_bridge/
├── tools/         # 8 tools (Tiers 1, 2, 3)
├── prompts/       # Strict fact-extraction prompts
├── workflows/     # Workflow templates
├── utils/         # Token counting, prompt loading
├── server.py      # MCP server
└── generator.py   # Legacy implementations
```
## Success Metrics
- ✅ 61% cost reduction in Claude tokens
- ✅ 174:1 compression ratio (50K → 300 tokens)
- ✅ A-grade quality specifications
- ✅ Progressive disclosure with workflows
## Contributing

Contributions welcome! Please:

- Read the Implementation Plan
- Read the Architecture Overview
- Submit a PR with tests

## License

Apache 2.0 License - see LICENSE
## Credits

- Architecture inspired by Gemini's analysis
- Based on Anthropic's MCP best practices
- Built with FastMCP

**Status:** ✅ Production Ready · **Last Updated:** November 15, 2025

🌟 Star us on GitHub if you find this useful!
## File details

Details for the file `hitoshura25_gemini_workflow_bridge-0.3.3.tar.gz`.

### File metadata

- Download URL: hitoshura25_gemini_workflow_bridge-0.3.3.tar.gz
- Size: 119.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7242d71b36e8accfde81eda0bc71c17f1943f60ddfd4fe7edf34af229ad61ff3` |
| MD5 | `155ddf185e6c450e812bf2094df5df8a` |
| BLAKE2b-256 | `619da03fcf62aeff4b3469dcdf3b1b555b335028550e1e4455e32b1558ef8820` |
### Provenance

The following attestation bundle was made for `hitoshura25_gemini_workflow_bridge-0.3.3.tar.gz`:

Publisher: release.yml on hitoshura25/gemini-workflow-bridge-mcp

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: hitoshura25_gemini_workflow_bridge-0.3.3.tar.gz
- Subject digest: 7242d71b36e8accfde81eda0bc71c17f1943f60ddfd4fe7edf34af229ad61ff3
- Sigstore transparency entry: 702182994
- Permalink: hitoshura25/gemini-workflow-bridge-mcp@ee5f889de78b83e296f42114e98f9be57515aff4
- Branch / Tag: refs/heads/main
- Owner: https://github.com/hitoshura25
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@ee5f889de78b83e296f42114e98f9be57515aff4
- Trigger Event: workflow_dispatch
## File details

Details for the file `hitoshura25_gemini_workflow_bridge-0.3.3-py3-none-any.whl`.

### File metadata

- Download URL: hitoshura25_gemini_workflow_bridge-0.3.3-py3-none-any.whl
- Size: 64.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `c069e37884821b03fe95adcc5b9ea0023b050db68d4297185b0110006eb13e5b` |
| MD5 | `78c9a8bf60bda17f768de684488bbe11` |
| BLAKE2b-256 | `cd4c1d6fd42f00811327f5986f444a0f8c5681cc40ac97efffc638722d339e45` |
### Provenance

The following attestation bundle was made for `hitoshura25_gemini_workflow_bridge-0.3.3-py3-none-any.whl`:

Publisher: release.yml on hitoshura25/gemini-workflow-bridge-mcp

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: hitoshura25_gemini_workflow_bridge-0.3.3-py3-none-any.whl
- Subject digest: c069e37884821b03fe95adcc5b9ea0023b050db68d4297185b0110006eb13e5b
- Sigstore transparency entry: 702182995
- Permalink: hitoshura25/gemini-workflow-bridge-mcp@ee5f889de78b83e296f42114e98f9be57515aff4
- Branch / Tag: refs/heads/main
- Owner: https://github.com/hitoshura25
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@ee5f889de78b83e296f42114e98f9be57515aff4
- Trigger Event: workflow_dispatch