Ask Human for Context MCP Server - GUI dialogs for AI assistant interaction
Ask Human for Context MCP Server
Bridge the gap between AI and human intelligence - A Model Context Protocol (MCP) server that enables AI assistants to ask humans for missing context during conversations and development workflows.
🤔 What is this?
The Ask Human for Context MCP Server is a specialized tool that allows AI assistants (like Claude in Cursor) to pause their workflow and ask you directly for clarification, preferences, or missing information through native GUI dialogs.
The Problem It Solves
When AI assistants encounter situations where they need human input to proceed effectively, they typically either:
- Make assumptions that might be wrong
- Ask generic questions in the chat
- Get stuck without clear direction
This MCP server enables true human-in-the-loop workflows where the AI can:
- ✅ Pause and ask for specific clarification
- ✅ Present context about why information is needed
- ✅ Get immediate, focused responses through native dialogs
- ✅ Continue with confidence based on your input
🎯 Use Cases
Perfect for scenarios where AI needs human guidance:
- Multiple Implementation Approaches: "Should I use React or Vue for this component?"
- Technology Preferences: "Which database would you prefer: PostgreSQL or MongoDB?"
- Domain-Specific Requirements: "What's the maximum file size for uploads in your system?"
- User Experience Decisions: "How should we handle errors - modal dialogs or inline messages?"
- Code Architecture: "Should this be a microservice or part of the monolith?"
- Missing Context: "What's the expected behavior when the API is down?"
🚀 Quick Start with Cursor
1. Add to Cursor MCP Configuration
Add this to your Cursor MCP settings (~/.cursor/mcp.json):
{
"mcpServers": {
"ask-human-for-context": {
"command": "uvx",
"args": ["ask-human-for-context-mcp", "--transport", "stdio"]
}
}
}
Note: No manual installation needed! uvx automatically downloads and runs the package from PyPI.
2. Restart Cursor
The MCP server will now be available to Claude in your Cursor sessions!
💬 How It Works
For AI Assistants
When Claude (or another AI) needs human input, it can call the asking_user_missing_context tool:
# AI calls this tool when it needs clarification
asking_user_missing_context(
question="Should I implement authentication using JWT tokens or session cookies?",
context="I'm building the login system for your web app. Both approaches are valid, but they have different security and performance trade-offs."
)
For Humans
You'll see a native dialog box like this:
📋 Missing Context:
I'm building the login system for your web app. Both approaches are
valid, but they have different security and performance trade-offs.
────────────────────────────────────────
❓ Question:
Should I implement authentication using JWT tokens or session cookies?
[Text Input Field]
[ OK ] [ Cancel ]
Your response gets sent back to the AI to continue the workflow.
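The dialog body above is just the two tool arguments laid out together. A minimal sketch of that assembly (the function name and exact layout are illustrative, not the server's actual code):

```python
def format_prompt(question, context=None):
    """Assemble the dialog text shown to the user (illustrative sketch)."""
    parts = []
    if context:
        # Context goes first so the user knows why the question is being asked.
        parts.append("📋 Missing Context:\n" + context)
        parts.append("─" * 40)  # visual separator between context and question
    parts.append("❓ Question:\n" + question)
    return "\n".join(parts)
```

When no context is supplied, the dialog would simply show the question on its own.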
🖥️ Platform Support
Cross-Platform Native Dialogs
| Platform | Technology | Features |
|---|---|---|
| macOS | osascript | Custom Cursor icon, 90-second timeout |
| Linux | zenity | Custom window icon, proper styling |
| Windows | tkinter | Native Windows dialogs |
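The table above amounts to a simple dispatch on the running platform. A sketch of that selection logic (the helper name is hypothetical; the server's internals may differ):

```python
import sys

def pick_dialog_backend(platform=None):
    """Choose a native dialog backend per the platform table (sketch)."""
    platform = platform or sys.platform
    if platform == "darwin":
        return "osascript"   # macOS: AppleScript "display dialog"
    if platform.startswith("linux"):
        return "zenity"      # Linux: GTK dialogs via the zenity CLI
    if platform.startswith("win"):
        return "tkinter"     # Windows: stdlib tkinter dialogs
    raise RuntimeError("No native dialog backend for platform: " + platform)
```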
Automatic Fallbacks
- Graceful error handling if GUI systems aren't available
- Clear error messages with troubleshooting guidance
- No crashes or hanging - always responds to the AI
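One way to guarantee "always responds to the AI" is to wrap the dialog call so every failure mode maps to a result string instead of an exception. A sketch under that assumption (function name and exception mapping are illustrative):

```python
import subprocess

def safe_ask(show_dialog, timeout=90):
    """Run a dialog callable and always return a string result (sketch).

    show_dialog(timeout) returns the user's text, or None if cancelled;
    it may raise FileNotFoundError (GUI tool missing) or
    subprocess.TimeoutExpired (no response in time).
    """
    try:
        answer = show_dialog(timeout)
    except FileNotFoundError as exc:
        return "❌ Error: dialog tool not available (%s)" % exc
    except subprocess.TimeoutExpired:
        return "⚠️ Timeout: No response within %d seconds" % timeout
    if answer is None:
        return "⚠️ Cancelled: User cancelled the prompt"
    if not answer.strip():
        return "⚠️ Empty response received"
    return "✅ User response: " + answer
```

Because every branch returns a string, the MCP tool call can never hang or crash the conversation.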
🔧 Installation Options
Option 1: uvx (Recommended - Production Ready)
Simply add to your Cursor MCP configuration - no manual installation required:
{
"ask-human-for-context": {
"command": "uvx",
"args": ["ask-human-for-context-mcp", "--transport", "stdio"]
}
}
✨ Auto-Install: uvx automatically downloads the latest version from PyPI!
🔄 Auto-Update: uvx handles version management and updates seamlessly.
Option 2: pip + Virtual Environment
# Create virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install package
pip install ask-human-for-context-mcp
# Add to Cursor config
{
"ask-human-for-context": {
"command": "/path/to/venv/bin/ask-human-for-context-mcp",
"args": ["--transport", "stdio"]
}
}
Option 3: Development Installation
# Clone and install for development
git clone https://github.com/galperetz/ask-human-for-context-mcp.git
cd ask-human-for-context-mcp
pip install -e .
# Use in Cursor config (local development)
{
"ask-human-for-context": {
"command": "uvx",
"args": ["--from", "/path/to/project", "ask-human-for-context-mcp", "--transport", "stdio"]
}
}
Note: For production use, prefer Option 1 which uses the published PyPI package.
⚙️ Configuration
Transport Modes
STDIO (Default)
Perfect for MCP clients like Cursor:
ask-human-for-context-mcp --transport stdio
SSE (Server-Sent Events)
For web applications:
ask-human-for-context-mcp --transport sse --host 0.0.0.0 --port 8080
Timeout Settings
- Default timeout: 90 seconds (1.5 minutes)
- Configurable range: 30 seconds to 2 hours
- User-friendly: Shows timeout in minutes for better UX
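The timeout behavior above can be captured in a couple of lines: clamp the requested value into the supported range, then render it in minutes for the dialog. A sketch (constants and function name are illustrative):

```python
MIN_TIMEOUT = 30            # seconds: documented lower bound
MAX_TIMEOUT = 2 * 60 * 60   # seconds: documented upper bound (2 hours)

def normalize_timeout(seconds=90):
    """Clamp a requested timeout and describe it in minutes (sketch)."""
    seconds = max(MIN_TIMEOUT, min(MAX_TIMEOUT, seconds))
    return seconds, "%g minute(s)" % (seconds / 60)
```

With the 90-second default this yields "1.5 minute(s)", matching the user-friendly display described above.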
📚 Tool Reference
asking_user_missing_context
Ask the user for missing context during AI workflows.
Parameters:
- question (string, required): The specific question (max 1000 chars)
- context (string, optional): Background explaining why context is needed (max 2000 chars)
Returns:
- ✅ User response: [user's answer] - when the user provides input
- ⚠️ Empty response received - when the user clicks OK without entering text
- ⚠️ Timeout: No response within [time] - when the dialog times out
- ⚠️ Cancelled: User cancelled the prompt - when the user cancels the dialog
- ❌ Error: [description] - when there are validation or system errors
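The documented character limits imply a small validation guard before any dialog is shown. A sketch of that check (constants, messages, and the function name are illustrative):

```python
MAX_QUESTION_CHARS = 1000
MAX_CONTEXT_CHARS = 2000

def validate_args(question, context=None):
    """Return an error string per the documented limits, or None if valid."""
    if not isinstance(question, str) or not question.strip():
        return "❌ Error: question must be a non-empty string"
    if len(question) > MAX_QUESTION_CHARS:
        return "❌ Error: question exceeds %d characters" % MAX_QUESTION_CHARS
    if context is not None and len(context) > MAX_CONTEXT_CHARS:
        return "❌ Error: context exceeds %d characters" % MAX_CONTEXT_CHARS
    return None  # arguments are valid
```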
Example Usage:
# Simple question
result = asking_user_missing_context(
question="What's the preferred color scheme for the UI?"
)
# Question with context
result = asking_user_missing_context(
question="Should I use REST or GraphQL for the API?",
context="I'm designing the backend architecture. The frontend will need to fetch user data, posts, and comments. Performance and caching are important considerations."
)
🛠️ Development
Requirements
- Python 3.8+
- Dependencies: mcp
- Platform-specific: osascript (macOS), zenity (Linux), tkinter (Windows)
Building
# Install dependencies
pip install -e .
# Build package
uv build
# Run tests
pytest
Project Structure
ask-human-for-context-mcp/
├── src/ask_human_for_context_mcp/
│   ├── __init__.py
│   ├── __main__.py
│   └── server.py            # Main MCP server implementation
├── assets/
│   └── cursor-icon.icns     # Custom Cursor icon for dialogs
├── pyproject.toml           # Project configuration
└── README.md
🤝 Integration Examples
Cursor AI Development Workflow
- AI encounters decision point: "I need to choose between TypeScript and JavaScript"
- AI calls the tool: Provides context about the project and asks for preference
- User sees dialog: Native popup with formatted question and context
- User responds: Types preference and clicks OK
- AI continues: Uses the human input to make informed decisions
Perfect for:
- Code reviews: "Should I refactor this function or leave it as-is?"
- Architecture decisions: "Microservices or monolith for this feature?"
- UI/UX choices: "Modal dialog or inline editing for this form?"
- Technology selection: "Which CSS framework fits your preferences?"
🔒 Security & Privacy
- Local execution: All dialogs run locally on your machine
- No data collection: No user responses are logged or transmitted
- Secure communication: Uses MCP's secure transport protocols
- Timeout protection: Automatic cleanup prevents hanging processes
📄 License
MIT License - see LICENSE file for details.
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
🙋 Support
- Issues: GitHub Issues
- Documentation: Model Context Protocol
- MCP Community: MCP Discussions
Made with ❤️ for better human-AI collaboration