Enterprise-grade MCP server with sequential thinking, project convention learning, and intelligent memory management

Enhanced MCP Memory

⚡ Optimized for Claude Sonnet 4 - This MCP server is tuned for Claude Sonnet 4, which gives the best performance and the fullest support for its AI-powered features.

An enhanced MCP (Model Context Protocol) server for intelligent memory and task management, designed for AI assistants and development workflows. Features semantic search, automatic task extraction, knowledge graphs, and comprehensive project management.

✨ Key Features

🧠 Intelligent Memory Management

  • Semantic search using sentence-transformers for natural language queries
  • Automatic memory classification with importance scoring
  • Duplicate detection and content deduplication
  • File path associations for code-memory relationships
  • Knowledge graph relationships with automatic similarity detection

🧬 Sequential Thinking Engine

  • Structured reasoning chains with 5-stage process (analysis, planning, execution, validation, reflection)
  • Context management with automatic token optimization
  • Conversation continuity across sessions with intelligent summarization
  • Real-time token estimation and compression (30-70% reduction)
  • Auto-extraction of key points, decisions, and action items

📋 Advanced Task Management

  • Auto-task extraction from conversations and code comments
  • Priority and category management with validation
  • Status tracking (pending, in_progress, completed, cancelled)
  • Task-memory relationships in knowledge graph
  • Project-based organization
  • Complex task decomposition into manageable subtasks
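
Auto-task extraction can be pictured as a simple scan for actionable markers in code comments. The sketch below is illustrative only (the server's actual extraction heuristics are not shown here); the tag-to-priority mapping is an assumption:

```python
import re

# Hypothetical sketch of auto-task extraction from code comments.
# The tag-to-priority mapping below is an assumption, not the server's rules.
TASK_PATTERN = re.compile(r"(?:#|//)\s*(TODO|FIXME|HACK)\s*:?\s*(.+)", re.IGNORECASE)

PRIORITY_BY_TAG = {"fixme": "high", "todo": "medium", "hack": "low"}

def extract_tasks(source: str) -> list[dict]:
    """Scan source text for TODO/FIXME/HACK comments and build task dicts."""
    tasks = []
    for line_no, line in enumerate(source.splitlines(), start=1):
        match = TASK_PATTERN.search(line)
        if match:
            tag, title = match.group(1).lower(), match.group(2).strip()
            tasks.append({
                "title": title,
                "priority": PRIORITY_BY_TAG.get(tag, "medium"),
                "status": "pending",
                "source_line": line_no,
            })
    return tasks
```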

🏗️ Project Convention Learning

  • Automatic environment detection - OS, shell, tools, and runtime versions
  • Project type recognition - Node.js, Python, Rust, Go, Java, MCP servers, etc.
  • Command pattern learning - Extracts npm scripts, Makefile targets, and project commands
  • Tool configuration detection - IDEs, linters, CI/CD, build tools, and testing frameworks
  • Dependency management - Package managers, lock files, and installation commands
  • Smart command suggestions - Corrects user commands based on project conventions
  • Windows-specific optimizations - Proper path separators and command formats
  • Memory integration - Stores learned conventions for AI context and future reference
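
Project type recognition typically works by checking for marker files at the project root. A minimal sketch, assuming a simple first-match lookup (the marker table here is illustrative, not the server's exact ruleset):

```python
from pathlib import Path

# Illustrative marker-file table; the server's detection rules may differ.
MARKERS = {
    "package.json": "Node.js",
    "pyproject.toml": "Python",
    "requirements.txt": "Python",
    "Cargo.toml": "Rust",
    "go.mod": "Go",
    "pom.xml": "Java",
}

def detect_project_type(project_path: str) -> str:
    """Return the first project type whose marker file exists at the root."""
    root = Path(project_path)
    for marker, project_type in MARKERS.items():
        if (root / marker).exists():
            return project_type
    return "unknown"
```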

📊 Performance Monitoring

  • Performance monitoring with detailed metrics
  • Health checks and system diagnostics
  • Automatic cleanup of old data and duplicates
  • Database optimization tools
  • Comprehensive logging and error tracking
  • Token usage analytics and optimization recommendations

🚀 Easy Deployment

  • uvx compatible for one-command installation
  • Zero-configuration startup with sensible defaults
  • Environment variable configuration
  • Cross-platform support (Windows, macOS, Linux)

🏗️ Project Structure

enhanced-mcp-memory/
├── mcp_server_enhanced.py    # Main MCP server with FastMCP integration
├── memory_manager.py         # Core memory/task logic and project detection
├── sequential_thinking.py    # Thinking chains and context optimization
├── database.py               # Database operations with retry mechanisms
├── requirements.txt          # Python dependencies
├── setup.py                  # Package configuration
├── data/                     # SQLite database storage
└── logs/                     # Application logs

🚀 Quick Start

Option 1: Using uvx (Recommended)

# Install and run with uvx
uvx enhanced-mcp-memory

Option 2: Manual Installation

# Clone and install
git clone https://github.com/cbunting99/enhanced-mcp-memory.git
cd enhanced-mcp-memory
pip install -e .

# Run the server
enhanced-mcp-memory

Option 3: Development Setup

# Clone repository
git clone https://github.com/cbunting99/enhanced-mcp-memory.git
cd enhanced-mcp-memory

# Install dependencies
pip install -r requirements.txt

# Run directly
python mcp_server_enhanced.py

⚙️ MCP Configuration

Add to your MCP client configuration:

For uvx installation:

{
  "mcpServers": {
    "memory-manager": {
      "command": "uvx",
      "args": ["enhanced-mcp-memory"],
      "env": {
        "LOG_LEVEL": "INFO",
        "MAX_MEMORY_ITEMS": "1000",
        "ENABLE_AUTO_CLEANUP": "true"
      }
    }
  }
}

For local installation:

{
  "mcpServers": {
    "memory-manager": {
      "command": "python",
      "args": ["mcp_server_enhanced.py"],
      "cwd": "/path/to/enhanced-mcp-memory",
      "env": {
        "LOG_LEVEL": "INFO",
        "MAX_MEMORY_ITEMS": "1000",
        "ENABLE_AUTO_CLEANUP": "true"
      }
    }
  }
}

🛠️ Available Tools

Core Memory Tools

  • get_memory_context(query) - Get relevant memories and context
  • create_task(title, description, priority, category) - Create new tasks
  • get_tasks(status, limit) - Retrieve tasks with filtering
  • get_project_summary() - Get comprehensive project overview

Sequential Thinking Tools

  • start_thinking_chain(objective) - Begin structured reasoning process
  • add_thinking_step(chain_id, stage, title, content, reasoning) - Add reasoning steps
  • get_thinking_chain(chain_id) - Retrieve complete thinking chain
  • list_thinking_chains(limit) - List recent thinking chains

Context Management Tools

  • create_context_summary(content, key_points, decisions, actions) - Compress context for token optimization
  • start_new_chat_session(title, objective, continue_from) - Begin new conversation with optional continuation
  • consolidate_current_session() - Compress current session for handoff
  • get_optimized_context(max_tokens) - Get token-optimized context
  • estimate_token_usage(text) - Estimate token count for planning
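
A common heuristic for English text is roughly four characters per token. A minimal sketch of what an estimator and a budget check could look like (the server's estimate_token_usage tool may use a different method):

```python
# Hedged sketch: character-based token estimate (~4 chars/token for English).
def estimate_tokens(text: str) -> int:
    """Rough token count; an empty string costs nothing."""
    return max(1, len(text) // 4) if text else 0

def fits_budget(text: str, max_tokens: int = 8000) -> bool:
    """Check a text against a token budget before sending it as context."""
    return estimate_tokens(text) <= max_tokens
```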

Enterprise Auto-Processing

  • auto_process_conversation(content, interaction_type) - Extract memories and tasks automatically
  • decompose_task(prompt) - Break complex tasks into subtasks

Project Convention Tools

  • auto_learn_project_conventions(project_path) - Automatically detect and learn project patterns
  • get_project_conventions_summary() - Get formatted summary of learned conventions
  • suggest_correct_command(user_command) - Suggest project-appropriate command corrections
  • remember_project_pattern(pattern_type, pattern, description) - Manually store project patterns
  • update_memory_context() - Refresh memory context with latest project conventions

System Management Tools

  • health_check() - Check server health and connectivity
  • get_performance_stats() - Get detailed performance metrics
  • cleanup_old_data(days_old) - Clean up old memories and tasks
  • optimize_memories() - Remove duplicates and optimize storage
  • get_database_stats() - Get comprehensive database statistics

🏗️ Project Convention Learning

The Enhanced MCP Memory Server automatically learns and remembers project-specific conventions to prevent AI assistants from suggesting incorrect commands or approaches:

Automatic Detection

  • Operating System: Windows vs Unix, preferred shell and commands
  • Project Type: Node.js, Python, Rust, Go, Java, MCP servers, FastAPI, Django
  • Development Tools: IDEs, linters, formatters, CI/CD configurations
  • Package Management: npm, yarn, pip, poetry, cargo, go modules
  • Build Systems: Vite, Webpack, Make, batch scripts, shell scripts

Smart Command Suggestions

# Instead of generic commands, suggests project-specific ones:
User types: "node server.js"
AI suggests: "Use 'npm run dev' instead for this project"

User types: "python main.py" 
AI suggests: "Use 'uvicorn main:app --reload' for this FastAPI project"

Windows Optimization

  • Automatically detects Windows environment
  • Uses cmd.exe and Windows-appropriate path separators
  • Suggests Windows-compatible commands (e.g., dir instead of ls)
  • Handles Windows-specific Python and Node.js patterns
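
The Windows-specific suggestions can be pictured as a lookup from Unix command names to their cmd.exe equivalents. The mapping below is a hypothetical illustration, not the server's actual table:

```python
import platform
from typing import Optional

# Hypothetical Unix-to-cmd.exe command mapping for illustration only.
UNIX_TO_WINDOWS = {
    "ls": "dir",
    "rm": "del",
    "cp": "copy",
    "mv": "move",
    "cat": "type",
    "clear": "cls",
}

def suggest_for_platform(command: str, system: Optional[str] = None) -> str:
    """Rewrite the command's leading word for Windows; pass through elsewhere."""
    system = system or platform.system()
    if system != "Windows":
        return command
    parts = command.split(maxsplit=1)
    head = UNIX_TO_WINDOWS.get(parts[0], parts[0])
    return head if len(parts) == 1 else f"{head} {parts[1]}"
```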

Memory Integration

All learned conventions are stored as high-importance memories that:

  • Appear in AI context for every interaction
  • Persist across sessions and project switches
  • Include environment warnings and project-specific guidance
  • Prevent repeated incorrect command suggestions

🔧 Configuration Options

Configure via environment variables:

  • LOG_LEVEL (default: INFO) - Logging level (DEBUG, INFO, WARNING, ERROR)
  • MAX_MEMORY_ITEMS (default: 1000) - Maximum memories per project
  • MAX_CONTEXT_TOKENS (default: 8000) - Token threshold for auto-compression
  • CLEANUP_INTERVAL_HOURS (default: 24) - Auto-cleanup interval
  • ENABLE_AUTO_CLEANUP (default: true) - Enable automatic cleanup
  • MAX_CONCURRENT_REQUESTS (default: 5) - Maximum concurrent requests
  • REQUEST_TIMEOUT (default: 30) - Request timeout in seconds
  • DATA_DIR (default: ~/ClaudeMemory) - Where to store data and logs
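
For example, when launching the server from a shell rather than through an MCP client config, the variables can be exported first (the values shown are illustrative, not recommendations):

```shell
# Override defaults before launching the server (illustrative values).
export LOG_LEVEL=DEBUG
export MAX_MEMORY_ITEMS=2000
export MAX_CONTEXT_TOKENS=12000
export DATA_DIR="$HOME/ClaudeMemory"

# enhanced-mcp-memory   # then start the server
```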

🧪 Testing

This package is production-ready and does not include a test suite in the distributed version. For development or CI, refer to the repository for test scripts and additional resources.

📊 Performance & Monitoring

The server includes built-in performance tracking:

  • Response time monitoring for all tools
  • Success rate tracking with error counts
  • Memory usage statistics
  • Database performance metrics
  • Automatic health checks

Access via the get_performance_stats() and health_check() tools.

🗄️ Database

  • SQLite for reliable, file-based storage
  • Automatic schema migrations for updates
  • Comprehensive indexing for fast queries
  • Built-in backup and optimization tools
  • Cross-platform compatibility

Default location: ./data/mcp_memory.db

🔍 Semantic Search

Powered by sentence-transformers for intelligent memory retrieval:

  • Natural language queries - "Find memories about database optimization"
  • Similarity-based matching using embeddings
  • Configurable similarity thresholds
  • Automatic model downloading (~90MB on first run)
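
Similarity-based matching reduces to comparing embedding vectors with cosine similarity and keeping hits above a threshold. In this self-contained sketch, the toy embed() function stands in for sentence-transformers, which a real deployment would use instead:

```python
import math

def embed(text: str) -> list[float]:
    """Toy bag-of-characters embedding (stand-in for a real model)."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors; 0.0 if either is zero."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def search(query: str, memories: list[str], threshold: float = 0.5) -> list[str]:
    """Return memories above the similarity threshold, best match first."""
    q = embed(query)
    scored = [(cosine_similarity(q, embed(m)), m) for m in memories]
    return [m for score, m in sorted(scored, reverse=True) if score >= threshold]
```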

🧠 Sequential Thinking

Structured reasoning system:

  • 5-stage thinking process: Analysis → Planning → Execution → Validation → Reflection
  • Token optimization: Real-time estimation and compression (30-70% reduction)
  • Context continuity: Intelligent session handoffs with preserved context
  • Auto-extraction: Automatically identifies key points, decisions, and action items
  • Performance tracking: Monitor reasoning chains and optimization metrics
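
One way to picture the five-stage chain is as a list of typed steps that is complete once every stage appears. A minimal sketch, assuming field names that are illustrative rather than the server's internal schema:

```python
from dataclasses import dataclass, field

# The five stages named in the documentation, in order.
STAGES = ("analysis", "planning", "execution", "validation", "reflection")

@dataclass
class ThinkingStep:
    stage: str
    title: str
    content: str

    def __post_init__(self):
        if self.stage not in STAGES:
            raise ValueError(f"unknown stage: {self.stage}")

@dataclass
class ThinkingChain:
    objective: str
    steps: list = field(default_factory=list)

    def add_step(self, stage: str, title: str, content: str) -> ThinkingStep:
        step = ThinkingStep(stage, title, content)
        self.steps.append(step)
        return step

    def is_complete(self) -> bool:
        """A chain is complete once every stage appears at least once."""
        seen = {s.stage for s in self.steps}
        return all(stage in seen for stage in STAGES)
```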

💼 Token Management

Advanced context optimization for high-scale deployments:

  • Smart compression: Pattern-based extraction preserves essential information
  • Token estimation: Real-time calculation for planning and budgeting
  • Context summarization: Automatic conversion of conversations to actionable summaries
  • Session consolidation: Seamless handoffs between conversation sessions
  • Performance analytics: Detailed metrics on compression ratios and response times
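
Pattern-based compression can be sketched as keeping only lines that look like key points, decisions, or action items. The regexes below are assumptions for illustration; the server's heuristics are more involved:

```python
import re

# Illustrative keep-patterns: bullets, plus decision/action keywords.
KEEP_PATTERNS = [
    re.compile(r"^\s*[-*•]\s"),
    re.compile(r"\b(decided|will|must|todo|action)\b", re.IGNORECASE),
]

def compress_context(text: str) -> str:
    """Drop lines that match none of the keep-patterns."""
    kept = [
        line for line in text.splitlines()
        if any(p.search(line) for p in KEEP_PATTERNS)
    ]
    return "\n".join(kept)

def compression_ratio(original: str, compressed: str) -> float:
    """Fraction of characters removed (0.0 means nothing was removed)."""
    return 1 - len(compressed) / len(original) if original else 0.0
```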

📝 Logging

Comprehensive logging system:

  • Daily log rotation in ./logs/ directory
  • Structured logging with timestamps and levels
  • Performance tracking integrated
  • Error tracking with stack traces

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Add tests for new functionality
  4. Ensure all tests pass
  5. Submit a pull request

📄 License

MIT License - see LICENSE file for details.

🏷️ Version History

  • v2.0.2 - Updated package build configuration and license compatibility fixes
  • v2.0.1 - Enhanced features with sequential thinking and project conventions
  • v1.2.0 - Enhanced MCP server with performance monitoring and health checks
  • v1.1.0 - Added semantic search and knowledge graph features
  • v1.0.0 - Initial release with basic memory and task management

Download Files

Source distribution: iflow_mcp_enhanced_mcp_memory-2.0.8.tar.gz (46.0 kB)

  • SHA256: 512c30e964391ed7421e7fea66a7c57b28ad405ee992c113812bfb7ee8a0079e
  • MD5: 62eca8fb525a5c91fd193e37610aeede
  • BLAKE2b-256: 4ff0bfd4baf6b86b17b741ca756ae029635b381de2293f148d9ca77e542a2b79

Built distribution: iflow_mcp_enhanced_mcp_memory-2.0.8-py3-none-any.whl (41.8 kB)

  • SHA256: 7cf5badbfeeb70e5725a9b5726d671e69539fb1a1f0b2c0510121d6ba0a55fd3
  • MD5: 5d1c8af4a108dc5d460642722663354b
  • BLAKE2b-256: 0b5f3e246efaeb9e813a3251d50a14de953b523b17c2542bbc2454abf6435356

Both files were uploaded via uv/0.9.10 from macOS; Trusted Publishing was not used.
