# rmc-mcp

A Model Context Protocol (MCP) server that wraps prompts with recursive meta-cognition instructions for AI code assistants such as Claude Code, Cursor, and GitHub Copilot.
## What is Recursive Meta-Cognition?

Recursive meta-cognition is a prompting technique that instructs AI assistants to implement solutions through multiple layers of self-reflection. Instead of generating code in one pass, the AI:

- **Breaks tasks into layers** - divides the implementation into distinct phases
- **Self-reflects after each layer** - evaluates what was done correctly, which edge cases are missing, and what could be improved
- **Iteratively refines** - applies improvements before moving to the next layer
- **Performs a final comprehensive review** - conducts a thorough review after all layers are complete

This approach produces more thoughtful, robust implementations by forcing the AI to pause and critically evaluate its own work.
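To make the idea concrete, here is a hypothetical sketch of layered wrapping as pure string templating. This is NOT the server's actual template (rmc-mcp generates its wrapper via the DeepSeek API; see `src/rmc_mcp/prompts.py`), just an illustration of the layer/reflection structure:

```python
# Hypothetical illustration of the layering idea -- not the server's
# actual template, which is produced via the DeepSeek API.

REFLECTION = (
    "1. What was implemented correctly?\n"
    "2. What edge cases might be missing?\n"
    "3. What could be improved before proceeding?"
)

def wrap_with_layers(prompt: str, layers: int = 3) -> str:
    """Build a meta-prompt that asks for `layers` implementation passes,
    each followed by the same self-reflection questions, then a final review."""
    parts = ["META-PROMPT: RECURSIVE META-COGNITION", f"Task: {prompt}", ""]
    for n in range(1, layers + 1):
        parts += [
            f"### Layer {n}",
            "Implement the next slice of the task.",
            f"SELF-REFLECTION AFTER LAYER {n}:",
            REFLECTION,
            "",
        ]
    parts.append("## FINAL COMPREHENSIVE REVIEW")
    return "\n".join(parts)
```

The real server delegates this structuring to an LLM so the layer breakdown is tailored to the task rather than generic.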
## Features

- **Single MCP tool**: `wrap_prompt` wraps any prompt with meta-cognition instructions
- **Configurable layers**: 1-10 layers of recursive self-reflection (default: 3)
- **Cost-effective**: uses the DeepSeek API (significantly cheaper than OpenAI/Anthropic)
- **Works with any AI assistant**: output can be used with Claude, Cursor, Copilot, ChatGPT, etc.
## Prerequisites
- Python 3.10+
- uv package manager
- Claude Code CLI
- DeepSeek API key
## Installation

### 1. Clone the repository

```bash
git clone https://github.com/gumruyanzh/rmc-mcp.git
cd rmc-mcp
```

### 2. Install dependencies

```bash
uv sync
```

### 3. Get your DeepSeek API key

- Go to the DeepSeek Platform (https://platform.deepseek.com/api_keys)
- Create an account if you don't have one
- Generate a new API key
- Copy the key for the next step

### 4. Add the server to Claude Code

```bash
claude mcp add rmc-mcp \
  -s user \
  -e DEEPSEEK_API_KEY="your-api-key-here" \
  -- uv run --directory /path/to/rmc-mcp rmc-mcp
```

Replace `/path/to/rmc-mcp` with the actual path where you cloned the repo.

### 5. Restart Claude Code

Exit and reopen Claude Code for the `wrap_prompt` tool to become available.
## Usage

Once installed, use the `wrap_prompt` tool in Claude Code.

**Basic usage:**

```
Use wrap_prompt: "Create a REST API for user authentication with JWT tokens"
```

**With more layers for complex tasks:**

```
Use wrap_prompt with 5 layers: "Build a React dashboard with real-time data visualization, filtering, and export functionality"
```

**With fewer layers for simple tasks:**

```
Use wrap_prompt with 2 layers: "Add input validation to the user registration form"
```
## Tool Reference

### wrap_prompt

Wraps a prompt with recursive meta-cognition instructions.

| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `prompt` | string | Yes | - | The prompt to wrap with meta-cognition instructions |
| `layers` | integer | No | 3 | Number of meta-cognition layers (1-10) |
| `max_tokens` | integer | No | 2000 | Maximum tokens for the response |

**Returns:** A wrapped meta-prompt ready to use with any AI code assistant.
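The defaults and bounds in the table can be expressed as a small validation helper. This is a hypothetical sketch for reference, not the server's actual code:

```python
def validate_args(prompt: str, layers: int = 3, max_tokens: int = 2000) -> dict:
    """Apply the documented defaults and bounds for the wrap_prompt tool.

    Hypothetical helper mirroring the parameter table above:
    prompt is required, layers must be 1-10 (default 3),
    max_tokens defaults to 2000.
    """
    if not prompt:
        raise ValueError("prompt is required")
    if not 1 <= layers <= 10:
        raise ValueError("layers must be between 1 and 10")
    return {"prompt": prompt, "layers": layers, "max_tokens": max_tokens}
```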
## Example Output

When you call:

```
Use wrap_prompt: "Create a Python function that validates email addresses"
```

the tool returns a structured meta-prompt like:

```
**META-PROMPT: RECURSIVE META-COGNITION FOR CODE GENERATION**

You are to implement the following technical requirement using a structured,
self-reflective approach. Follow this exact process:

## LAYER BREAKDOWN

### Layer 1: Basic Structure & Core Validation
- Basic function signature and structure
- Core email format validation
- Simple regex or string-based validation

**SELF-REFLECTION AFTER LAYER 1:**
1. What was implemented correctly?
2. What edge cases might be missing?
3. What could be improved before proceeding?

### Layer 2: RFC-Compliant Validation Enhancement
...

### Layer 3: Production-Ready Enhancements
...

## FINAL COMPREHENSIVE REVIEW
...
```

You then use this output with any AI assistant to get a more thoughtful implementation.
How It Works
┌─────────────────────────────────────────────────────────────────┐
│ Your Prompt │
│ "Create a REST API for user authentication" │
└─────────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ rmc-mcp Server │
│ 1. Takes your prompt │
│ 2. Sends to DeepSeek with meta-cognition template │
│ 3. Returns wrapped prompt with layer instructions │
└─────────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ Wrapped Meta-Prompt │
│ - Layer-based implementation plan │
│ - Self-reflection questions after each layer │
│ - Final review criteria │
└─────────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ Use with Any AI Assistant │
│ Claude Code, Cursor, Copilot, ChatGPT, etc. │
└─────────────────────────────────────────────────────────────────┘
## Project Structure

```
rmc-mcp/
├── pyproject.toml           # Project config + dependencies
├── src/
│   └── rmc_mcp/
│       ├── __init__.py      # Package initialization
│       ├── server.py        # MCP server with wrap_prompt tool
│       └── prompts.py       # Meta-cognition prompt template
├── meta_prompt_wrapper.sh   # Original shell script (reference)
└── README.md
```
## Configuration Options

### Environment Variables

| Variable | Required | Description |
|---|---|---|
| `DEEPSEEK_API_KEY` | Yes | Your DeepSeek API key |

### MCP Server Scopes

You can install the server at different scopes.

**User scope (recommended)** - available in all your projects:

```bash
claude mcp add rmc-mcp -s user -e DEEPSEEK_API_KEY="..." -- uv run --directory /path/to/rmc-mcp rmc-mcp
```

**Project scope** - available only in a specific project:

```bash
claude mcp add rmc-mcp -s project -e DEEPSEEK_API_KEY="..." -- uv run --directory /path/to/rmc-mcp rmc-mcp
```
## Troubleshooting

### "DEEPSEEK_API_KEY not set" error

Make sure you included the `-e DEEPSEEK_API_KEY="your-key"` flag when adding the MCP server:

```bash
claude mcp add rmc-mcp -s user -e DEEPSEEK_API_KEY="your-key" -- uv run --directory /path/to/rmc-mcp rmc-mcp
```

### Tool not appearing in Claude Code

- Make sure you ran `uv sync` in the project directory
- Restart Claude Code completely (exit and reopen)
- Check that the MCP server is registered: `claude mcp list`

### Testing the server manually

```bash
# Should start and wait for stdio input (Ctrl+C to exit)
DEEPSEEK_API_KEY="your-key" uv run rmc-mcp
```
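For a slightly deeper manual check, you can hand-craft the JSON-RPC `initialize` request that MCP clients send over stdio and pipe it into the server. The message shape follows the MCP specification; the `protocolVersion` string and `clientInfo` values below are illustrative and may need to match the version your client targets:

```python
import json

# Hand-crafted MCP "initialize" request (JSON-RPC 2.0 over stdio).
# protocolVersion and clientInfo are illustrative values; consult the
# MCP specification for the version your tooling expects.
init_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "manual-test", "version": "0.0.1"},
    },
}

# MCP's stdio transport expects one JSON object per line:
print(json.dumps(init_request))
```

Piping the printed line into `DEEPSEEK_API_KEY="your-key" uv run rmc-mcp` should produce an `initialize` response on stdout if the server is wired up correctly.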
### Testing with MCP Inspector

```bash
npx @modelcontextprotocol/inspector uv run --directory /path/to/rmc-mcp rmc-mcp
```
## Why DeepSeek?

This tool uses DeepSeek instead of the OpenAI or Anthropic APIs because:

- **Cost-effective**: DeepSeek is significantly cheaper per token
- **Quality**: deepseek-chat produces high-quality prompt transformations
- **OpenAI-compatible API**: easy to integrate using the OpenAI Python SDK
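Because the API is OpenAI-compatible, calling DeepSeek only requires pointing the OpenAI SDK at a different base URL. A minimal sketch, with request construction split out so it can be inspected without a network call (the base URL and `deepseek-chat` model name are DeepSeek's documented values; the function names are hypothetical):

```python
import os

DEEPSEEK_BASE_URL = "https://api.deepseek.com"  # OpenAI-compatible endpoint

def build_request(system_prompt: str, user_prompt: str,
                  max_tokens: int = 2000) -> dict:
    """Assemble chat-completion kwargs for DeepSeek's OpenAI-compatible API."""
    return {
        "model": "deepseek-chat",
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
        "max_tokens": max_tokens,
    }

def call_deepseek(kwargs: dict) -> str:
    """Send the request; requires `pip install openai` and DEEPSEEK_API_KEY."""
    # Imported here so build_request stays usable without the SDK installed.
    from openai import OpenAI
    client = OpenAI(api_key=os.environ["DEEPSEEK_API_KEY"],
                    base_url=DEEPSEEK_BASE_URL)
    return client.chat.completions.create(**kwargs).choices[0].message.content
```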
You can get a DeepSeek API key at https://platform.deepseek.com/api_keys
## License

MIT

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.