Human-In-the-Loop MCP Server
A powerful Model Context Protocol (MCP) Server that enables AI assistants to interact with humans through intuitive GUI dialogs. This server bridges the gap between automated AI processes and human decision-making by providing tools for real-time user input, choices, confirmations, and feedback.
🚀 Features
💬 Interactive Dialog Tools
- Text Input: Get text, numbers, or other data from users
- Multiple Choice: Present options for single or multiple selections
- Multi-line Input: Collect longer text content, code, or detailed descriptions
- Confirmation Dialogs: Ask for yes/no decisions before proceeding
- Information Messages: Display notifications and status updates
- LLM Guidance: Built-in prompting guidance for when to use human-in-the-loop tools
🖥️ Cross-Platform Support
- Windows: Modern Windows 11-style GUI with hover effects and refined visual design
- macOS: Native macOS experience with SF Pro Display fonts and proper window management
- Linux: Ubuntu-compatible GUI with modern styling and system fonts
⚡ Advanced Features
- Non-blocking Operation: All dialogs run in separate threads
- Timeout Protection: Configurable timeouts prevent hanging
- Platform Detection: Automatic optimization for each operating system
- Modern UI Design: Beautiful Windows 11-style interface with smooth animations and hover effects
- Error Handling: Comprehensive error reporting and recovery
- Health Monitoring: Built-in health check and status reporting
- Keyboard Shortcuts: Full keyboard navigation support
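The non-blocking and timeout behaviors above are commonly combined: the dialog runs in a worker thread, and the caller waits on a queue with a deadline so an unanswered dialog can't hang the server. A minimal sketch of that pattern (names like `run_with_timeout` are illustrative; the server's actual implementation may differ):

```python
import queue
import threading

def run_with_timeout(dialog_fn, timeout=300.0):
    """Run a blocking dialog function in a worker thread.

    Returns a response dict; on timeout, reports failure instead of
    hanging the caller. Hypothetical helper, for illustration only.
    """
    results = queue.Queue(maxsize=1)
    threading.Thread(target=lambda: results.put(dialog_fn()), daemon=True).start()
    try:
        return {"success": True, "user_input": results.get(timeout=timeout)}
    except queue.Empty:
        return {"success": False, "error": f"Dialog timed out after {timeout}s"}

# Simulate an instantly answered "dialog":
print(run_with_timeout(lambda: "Alice", timeout=1.0))
```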
🎨 Modern Visual Design
The Human-In-the-Loop MCP Server features a clean, modern interface:
Windows 11-Style Interface
- Clean, modern color scheme with Windows 11-inspired design language
- Smooth hover effects on buttons and interactive elements
- Enhanced typography using Segoe UI and Consolas fonts
- Consistent spacing and improved visual hierarchy
- Professional appearance suitable for business and development environments
Cross-Platform Consistency
- Platform-specific fonts and styling (SF Pro on macOS, Segoe UI on Windows, Ubuntu on Linux)
- Adaptive sizing and positioning based on screen characteristics
- Native-feeling interfaces that respect platform conventions
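Platform-specific styling like this is typically driven by a small lookup on `platform.system()`. A hypothetical sketch of such a helper, mirroring the fonts listed above (the server's actual `get_system_font()` may choose differently):

```python
import platform

def pick_system_font():
    """Return a (family, size) pair for the current OS.

    Illustrative mapping only; falls back to Ubuntu's system
    font on other Linux-like platforms.
    """
    return {
        "Darwin": ("SF Pro Display", 13),
        "Windows": ("Segoe UI", 10),
    }.get(platform.system(), ("Ubuntu", 11))

family, size = pick_system_font()
```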
Enhanced User Experience
- Keyboard shortcuts for all dialogs (Enter to confirm, Escape to cancel)
- Improved focus management with logical tab order
- Better accessibility with high contrast colors and clear visual feedback
- Responsive design that works on different screen sizes
📦 Installation
Prerequisites
- Python 3.8 or higher
- tkinter (usually included with Python)
- pip package manager
Quick Start
- Clone the repository:
git clone https://github.com/GongRzhe/Human-In-the-Loop-MCP-Server.git
cd Human-In-the-Loop-MCP-Server
- Create a virtual environment (recommended):
python -m venv .venv
# Windows
.venv\Scripts\activate
# macOS/Linux
source .venv/bin/activate
- Install dependencies:
pip install fastmcp pydantic
- Run the server:
python human_loop_server.py
Platform-Specific Setup
macOS
- Grant Python accessibility permissions in System Preferences > Security & Privacy > Accessibility
- This allows proper window focus and app activation
Windows
- No additional setup required
- Windows Defender may prompt for network access permission
Linux
- Ensure tkinter is installed:
sudo apt-get install python3-tk  # Ubuntu/Debian
- Some distributions may require additional GUI libraries
🛠️ Usage
Basic Integration
The server provides several MCP tools that can be used by AI assistants:
Get User Input
# Request text input from user
result = await get_user_input(
title="User Information",
prompt="Please enter your name:",
default_value="",
input_type="text"
)
Get User Choice
# Present multiple options
result = await get_user_choice(
title="Select Option",
prompt="Choose your preferred programming language:",
choices=["Python", "JavaScript", "Java", "C++"],
allow_multiple=False
)
Multi-line Input
# Collect longer text content
result = await get_multiline_input(
title="Code Review",
prompt="Please provide your code review comments:",
default_value=""
)
Confirmation Dialog
# Ask for confirmation
result = await show_confirmation_dialog(
title="Confirm Action",
message="Are you sure you want to delete this file?"
)
Information Message
# Display information
result = await show_info_message(
title="Process Complete",
message="Your file has been successfully processed!"
)
Get LLM Guidance
# Get comprehensive guidance on when to use human-in-the-loop tools
guidance = await get_human_loop_prompt()
# Returns structured guidance with examples and best practices
Response Format
All tools return structured responses:
{
"success": true,
"user_input": "User's response",
"cancelled": false,
"platform": "darwin",
"input_type": "text"
}
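When consuming these responses, check `success` and `cancelled` before trusting `user_input`. A minimal sketch, assuming the field names shown above (`extract_input` is an illustrative helper, not part of the server's API):

```python
def extract_input(result):
    """Return the user's input, or None if the dialog failed,
    timed out, or was cancelled."""
    if not result.get("success") or result.get("cancelled"):
        return None
    return result.get("user_input")

response = {
    "success": True,
    "user_input": "User's response",
    "cancelled": False,
    "platform": "darwin",
    "input_type": "text",
}
print(extract_input(response))  # -> User's response
```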
Health Check
Monitor server status:
status = await health_check()
# Returns detailed platform and status information
🧠 LLM Integration Guidance
The server includes built-in prompting guidance to help AI assistants understand when and how to use human-in-the-loop tools effectively:
guidance = await get_human_loop_prompt()
This tool returns comprehensive guidance including:
- When to use each tool type
- Best practices for user interaction
- Decision framework for human-in-the-loop integration
- Example scenarios and usage patterns
- Workflow integration tips
Smart Decision Making
The guidance helps LLMs make intelligent decisions about when to pause for human input:
- Ambiguous requirements → Ask for clarification
- Creative decisions → Get user preferences
- Destructive actions → Require confirmation
- Missing information → Request specific details
- Long processes → Provide status updates
This ensures human interaction is meaningful and well-timed, enhancing rather than interrupting the user experience.
🔧 Configuration
Environment Variables
| Variable | Description | Default |
|---|---|---|
| HILMCP_TIMEOUT | Dialog timeout in seconds | 300 |
| HILMCP_FONT_SIZE | UI font size | Platform-specific |
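Inside the server, such variables would typically be read with a fallback to the documented defaults. A hedged sketch, assuming only the variable names above:

```python
import os

# Fall back to the documented default (300 s) when unset.
DIALOG_TIMEOUT = float(os.environ.get("HILMCP_TIMEOUT", "300"))
# None means "use the platform-specific default size".
FONT_SIZE = os.environ.get("HILMCP_FONT_SIZE")

print(DIALOG_TIMEOUT)
```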
Customization
You can modify the server by:
- Changing Fonts: Edit the get_system_font() function
- Window Sizes: Modify geometry settings in dialog classes
- Timeouts: Adjust timeout values in tool functions
- Platform Behavior: Customize platform-specific configurations
📋 API Reference
Tools
| Tool | Description |
|---|---|
| get_user_input | Single-line text/number input |
| get_user_choice | Multiple choice selection |
| get_multiline_input | Multi-line text input |
| show_confirmation_dialog | Yes/No confirmation |
| show_info_message | Information display |
| get_human_loop_prompt | Guidance on when to use human-in-the-loop tools |
| health_check | Server status check |
Parameters
get_user_input
- title (str): Dialog window title
- prompt (str): Question/prompt text
- default_value (str): Pre-filled value
- input_type (str): "text", "integer", or "float"
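For "integer" and "float" requests, the returned value reflects the requested type. A hypothetical sketch of the conversion (`coerce_input` is illustrative; the server performs its own validation):

```python
def coerce_input(raw, input_type="text"):
    """Convert raw dialog text per the requested input_type.

    Illustrative only; raises ValueError on text that does not
    parse as the requested numeric type.
    """
    if input_type == "integer":
        return int(raw)
    if input_type == "float":
        return float(raw)
    return raw

print(coerce_input("42", "integer"))  # -> 42
```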
get_user_choice
- title (str): Dialog window title
- prompt (str): Question/prompt text
- choices (List[str]): Available options
- allow_multiple (bool): Allow multiple selections
get_multiline_input
- title (str): Dialog window title
- prompt (str): Question/prompt text
- default_value (str): Pre-filled text
show_confirmation_dialog
- title (str): Dialog window title
- message (str): Confirmation message
show_info_message
- title (str): Dialog window title
- message (str): Information message
get_human_loop_prompt
- No parameters required
- Returns comprehensive guidance for LLMs on when and how to use human-in-the-loop tools
🔍 Troubleshooting
Common Issues
GUI Not Appearing
- Check if GUI environment is available
- Verify tkinter installation
- Run health check to diagnose issues
Permission Errors (macOS)
- Grant accessibility permissions in System Preferences
- Restart terminal after granting permissions
Import Errors
- Ensure virtual environment is activated
- Install dependencies:
pip install fastmcp pydantic
Dialog Timeout
- Increase timeout value in environment variables
- Check if user interaction is required
Visual Issues on Windows
- Ensure you're running on Windows 10/11 for best visual experience
- Some styling features may be limited on older Windows versions
- Update your graphics drivers if experiencing display issues
Debug Mode
Enable debug logging:
python human_loop_server.py --debug
🏗️ Development
Project Structure
Human-In-the-Loop-MCP-Server/
├── human_loop_server.py # Main server implementation
├── .gitignore # Git ignore file
├── .venv/ # Virtual environment
└── README.md # This file
Contributing
- Fork the repository
- Create a feature branch: git checkout -b feature-name
- Make your changes
- Add tests if applicable
- Commit your changes: git commit -am 'Add feature'
- Push to the branch: git push origin feature-name
- Submit a pull request
Code Style
- Follow PEP 8 guidelines
- Use type hints
- Add docstrings for functions and classes
- Include error handling
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🤝 Acknowledgments
- Built with FastMCP framework
- Uses Pydantic for data validation
- Cross-platform GUI powered by tkinter
🔗 Links
- Repository: https://github.com/GongRzhe/Human-In-the-Loop-MCP-Server
- Issues: Report bugs or request features
- MCP Protocol: Learn about Model Context Protocol
📊 Usage Examples
Example 1: Collecting User Preferences
# Get user's preferred settings
preferences = await get_user_choice(
title="Setup Preferences",
prompt="Select your preferred theme:",
choices=["Dark", "Light", "Auto"],
allow_multiple=False
)
# Configure based on user choice
if preferences["selected_choice"] == "Dark":
apply_dark_theme()
Example 2: Code Review Workflow
# Get code for review
code = await get_multiline_input(
title="Code Review",
prompt="Paste the code you want reviewed:",
default_value=""
)
# Process the code
analysis = analyze_code(code["user_input"])
# Show results
await show_info_message(
title="Review Complete",
message=f"Analysis complete. Found {len(analysis.issues)} issues."
)
Example 3: Confirmation Before Action
# Confirm before destructive action
confirmation = await show_confirmation_dialog(
title="Delete Confirmation",
message="This will permanently delete all selected files. Continue?"
)
if confirmation["confirmed"]:
delete_files()
await show_info_message("Success", "Files deleted successfully!")
else:
await show_info_message("Cancelled", "Operation cancelled by user.")
Example 4: LLM Using Built-in Guidance
# LLM gets guidance on when to use human-in-the-loop tools
guidance = await get_human_loop_prompt()
# LLM analyzes the situation using the decision framework
user_request = "Create a backup of my important files"
# Following guidance: "ambiguous requirements" → ask for clarification
file_location = await get_user_input(
title="Backup Configuration",
prompt="Which directory contains your important files?",
default_value="~/Documents"
)
backup_format = await get_user_choice(
title="Backup Options",
prompt="Choose backup format:",
choices=["Full Backup", "Compressed Archive", "Sync Copy"]
)
# Following guidance: "destructive actions" → require confirmation
if backup_format["selected_choice"] == "Full Backup":
confirmed = await show_confirmation_dialog(
title="Confirm Backup",
message=f"Create full backup of {file_location['user_input']}? This may take several minutes."
)
if confirmed["confirmed"]:
# Perform backup with status updates
await show_info_message("Backup Started", "Creating backup... Please wait.")
# ... backup process ...
await show_info_message("Success", "Backup completed successfully!")
Made with ❤️ for the AI community