# 🔨 MCP Forge
Convert any application into an MCP (Model Context Protocol) server — with AI assistance.
MCP Forge is a self-hosted AI agent that analyzes your existing app (via OpenAPI spec, GitHub repo, live URL, or local code) and generates a production-ready MCP server that Claude Desktop, Claude Code, or any MCP client can use directly.
## ✨ Features
| Category | Highlights |
|---|---|
| Source ingestion | OpenAPI/Swagger URL · GitHub repo · Live URL probing · Local folder (mnt/) · File upload · Manual description |
| AI agent | Multi-LLM (Gemini · Anthropic · OpenAI · local HuggingFace) · per-project chat · clarification Q&A loop |
| Code generation | Python FastMCP · Node.js · Go · Generic · LLM polish pass · security audit |
| Versioning | Snapshot on every generation · one-click rollback · optional git commits |
| Testing | AI-generated pytest cases · in-container runner · full test history |
| Dashboard | Real-time logs · 6-tab project view · editable .env config from the browser |
| Claude / Codex | Claude Desktop plugin (stdio + SSE) · Claude Code plugin (marketplace + .mcp.json) · Codex plugin · forge CLI |
## 🚀 Quick Start

### 1 — Get the code & configure

```bash
git clone https://github.com/coderXcode/mcp-forge.git
cd mcp-forge
cp .env.example .env
```
Open `.env` and fill in at least one LLM key, or configure a local model as described in the sections below:
```env
LLM_PROVIDER=gemini                           # or: anthropic | openai | local
GEMINI_API_KEY=your-key-here
MCP_AUTH_TOKEN=change-me-to-something-secret  # auth token for Claude/Codex
```
Free option: Gemini has a free tier at aistudio.google.com.
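For reference, `.env` is plain `KEY=value` lines; the sketch below is a minimal parser for illustration (a hypothetical helper, not part of MCP Forge; in practice Docker Compose reads `.env` directly):

```python
from pathlib import Path

def parse_env(path: str) -> dict[str, str]:
    """Parse simple KEY=value lines; skips blanks and # comments.

    Note: real dotenv loaders also handle quoting and escaping.
    """
    env: dict[str, str] = {}
    for raw in Path(path).read_text().splitlines():
        line = raw.split("#", 1)[0].strip()  # drop inline comments
        if not line or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env
```

For example, `parse_env(".env")["LLM_PROVIDER"]` would return `"gemini"` for the file above.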
### 2 — Start

```bash
docker compose up -d
```
| Service | URL |
|---|---|
| 🌐 Dashboard | http://localhost:8000 |
| 🔌 MCP endpoint | http://localhost:8001/sse |
### 3 — Open the dashboard
Visit http://localhost:8000 → click + New Project to begin.
## 🤖 Integrate with Claude Desktop / Claude Code
For full step-by-step instructions see user_manual.md.
**Claude Desktop** (one command):

```bash
# Windows
.\scripts\install_claude_plugin.ps1

# macOS / Linux
bash scripts/install_claude_plugin.sh
```
**Claude Code** (one-liner):

```
/plugin marketplace add coderXcode/mcp-forge
```
**forge CLI:**

```bash
pip install mcpforge
```
## 🖥️ Local Model (No API Key)
Run entirely offline using any HuggingFace model — no API key required. Requires an NVIDIA GPU.
```env
LLM_PROVIDER=local
LOCAL_MODEL=Qwen/Qwen2.5-Coder-14B-Instruct   # or any HuggingFace model ID
LOCAL_MODEL_DEVICE=auto
LOCAL_MODEL_LOAD_IN_4BIT=true                 # 4-bit NF4 quantization — fits on 8 GB VRAM
```
Rebuild once after changing these settings:
```bash
docker compose down && docker compose build && docker compose up -d
```
The model downloads from HuggingFace on first use and is cached for future runs. You can replace LOCAL_MODEL with any HuggingFace model that supports chat/instruction format — some well-tested options:
| Model | VRAM (4-bit) | Notes |
|---|---|---|
| `Qwen/Qwen2.5-Coder-7B-Instruct` | ~4 GB | Lightest option |
| `Qwen/Qwen2.5-Coder-14B-Instruct` | ~8 GB | Recommended |
| `deepseek-ai/deepseek-coder-v2-lite-instruct` | ~8 GB | Strong alternative |
| `Qwen/Qwen2.5-Coder-32B-Instruct` | ~18 GB | Best quality |
| `mistralai/Mistral-7B-Instruct-v0.3` | ~4 GB | General purpose |
Set `LOCAL_MODEL_LOAD_IN_4BIT=false` and `LOCAL_MODEL_DEVICE=cpu` to run on CPU (slow, but no GPU needed).
See user_manual.md for full setup details including NVIDIA Container Toolkit requirements.
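As a rough rule of thumb, 4-bit quantization stores about half a byte per weight, so the VRAM figures above scale with parameter count plus roughly a gigabyte of overhead; a back-of-envelope estimator (my own heuristic, not from the project):

```python
def estimate_vram_gb(params_billions: float,
                     bytes_per_param: float = 0.5,
                     overhead_gb: float = 1.0) -> float:
    """Rough VRAM estimate for loading a model's weights.

    4-bit quantization ~= 0.5 bytes/param; the overhead term very
    approximately covers CUDA context and quantization metadata.
    """
    return params_billions * bytes_per_param + overhead_gb

print(estimate_vram_gb(14))  # 8.0 — matches the ~8 GB in the table
```

Actual usage also depends on context length (KV cache grows with it), so treat this as a lower bound.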
## ⚙️ Key Configuration
All settings live in .env (also editable live from the dashboard Config page).
| Variable | Default | Description |
|---|---|---|
| `LLM_PROVIDER` | `gemini` | `gemini` \| `anthropic` \| `openai` \| `local` |
| `GEMINI_API_KEY` | — | Google Gemini API key |
| `ANTHROPIC_API_KEY` | — | Anthropic Claude API key |
| `OPENAI_API_KEY` | — | OpenAI API key |
| `MCP_AUTH_TOKEN` | `change-me` | Auth token for Claude / Codex — change this |
| `GITHUB_TOKEN` | — | PAT for private GitHub repos |
| `ENABLE_GIT_SNAPSHOTS` | `false` | Auto-commit each snapshot to git |
| `DEBUG` | `false` | Verbose logs + uvicorn reload |
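Each provider needs its matching API key before generation can work; a sketch of the kind of startup check an app like this might run (hypothetical code, not MCP Forge's actual validation):

```python
KEY_FOR_PROVIDER = {
    "gemini": "GEMINI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "local": None,  # local HuggingFace models need no API key
}

def check_llm_config(env: dict[str, str]) -> None:
    """Fail fast if LLM_PROVIDER is unknown or its API key is missing."""
    provider = env.get("LLM_PROVIDER", "gemini")
    if provider not in KEY_FOR_PROVIDER:
        raise ValueError(f"Unknown LLM_PROVIDER: {provider!r}")
    key_name = KEY_FOR_PROVIDER[provider]
    if key_name and not env.get(key_name):
        raise ValueError(f"{provider} selected but {key_name} is not set")
```

For example, `check_llm_config({"LLM_PROVIDER": "local"})` passes, while selecting `gemini` without `GEMINI_API_KEY` raises `ValueError`.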
## 🐳 Useful Docker Commands

```bash
docker compose up -d            # start
docker compose up -d --build    # rebuild after code changes
docker compose restart          # restart after .env changes
docker compose down -v          # stop + wipe database
docker logs mcp_forge_app -f    # app logs
docker logs mcp_forge_mcp -f    # MCP server logs
```
## 📖 Documentation
| Document | What's in it |
|---|---|
| user_manual.md | Full setup · Claude Desktop · Claude Code · Codex · forge CLI · troubleshooting · architecture |
## 📝 License
MIT