
ISA MCP CLI - A comprehensive command-line interface for an MCP server with RAG capabilities


isA_MCP - AI-Powered Smart MCP Server

Project Overview

isA_MCP is a sophisticated AI-powered Smart MCP (Model Context Protocol) Server that has evolved into an intelligent, enterprise-grade platform with comprehensive service integrations and automated capability discovery.

Key Features

  • AI-Powered Tool Selection - Intelligent tool recommendation based on natural language queries
  • Data Analytics Suite - Complete 5-step data processing workflow with LLM-powered SQL generation
  • Advanced Web Services - Modern web scraping with anti-detection and JavaScript execution
  • E-commerce Integration - Full Shopify integration with cart and checkout management
  • RAG & Document Analytics - Retrieval-augmented generation with multi-format document processing
  • AI Image Generation - Image creation and transformation capabilities
  • Memory Management - Persistent information storage with intelligent retrieval
  • Enterprise Security - Multi-level authorization with audit logging
  • Production-Ready - Docker cluster deployment with load balancing

System Architecture

graph TB
    subgraph "Load Balancer"
        LB[Nginx Load Balancer :8081]
    end
    
    subgraph "Smart MCP Cluster"
        S1[Smart MCP Server :4321]
        S2[Smart MCP Server :4322]
        S3[Smart MCP Server :4323]
    end
    
    subgraph "AI Core"
        AD[Auto Discovery]
        TS[Tool Selector]
        PS[Prompt Selector]
    end
    
    subgraph "Services Layer"
        DA[Data Analytics]
        WS[Web Services]
        RAG[RAG Service]
        SH[Shopify]
        IM[Image Gen]
        MEM[Memory]
        EV[Event Sourcing]
    end
    
    subgraph "Data Layer"
        PG[(PostgreSQL/Supabase)]
        MY[(MySQL)]
        SS[(SQL Server)]
        VEC[(Vector Store)]
    end
    
    LB --> S1
    LB --> S2
    LB --> S3
    
    S1 --> AD
    S2 --> AD
    S3 --> AD
    
    AD --> TS
    AD --> PS
    
    TS --> DA
    TS --> WS
    TS --> RAG
    TS --> SH
    TS --> IM
    TS --> MEM
    TS --> EV
    
    DA --> PG
    DA --> MY
    DA --> SS
    RAG --> VEC
    WS --> PG

Quick Start

Requirements

  • Python 3.11+
  • Docker & Docker Compose
  • PostgreSQL 14+ (with pgvector extension)
  • Redis 6+ (for caching and sessions)

Installation

  1. Clone the repository:
git clone <repository_url>
cd isA_MCP
  2. Install dependencies:
pip install -r requirements.txt
  3. Environment setup:
# Copy and configure environment variables
cp .env.example .env
# Edit .env with your database credentials and API keys
  4. Database setup:
# Start PostgreSQL with pgvector extension
docker-compose up -d postgres
  5. Start the Smart MCP Server:

Option A: Single Server (Development)

python smart_mcp_server.py

Option B: Production Cluster

# Start complete cluster with load balancer
docker-compose up -d
# Access via http://localhost:8081

Option C: Railway Deployment

# One-click deployment to Railway
railway up

๐Ÿ› ๏ธ Services & Capabilities

๐Ÿ“Š Data Analytics Service

Complete 5-Step Data Processing Workflow

  • Step 1: Metadata extraction from databases (PostgreSQL, MySQL, SQL Server) and files
  • Step 2: Semantic enrichment with business entity identification
  • Step 3: Embedding generation and vector storage (pgvector)
  • Step 4: Natural language query matching using semantic similarity
  • Step 5: LLM-powered SQL generation with fallback strategies
  • Tools: data_sourcing, data_query
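Step 4 of the workflow matches a natural-language query to stored metadata by embedding similarity. The following is a minimal in-process sketch of that matching; the real service scores against pgvector, and the vectors and table names below are made up for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Pretend metadata embeddings; real ones come from an embedding model.
table_embeddings = {
    "sales": [0.9, 0.1, 0.2],
    "customers": [0.2, 0.8, 0.1],
}
query_embedding = [0.85, 0.15, 0.25]  # e.g. "show me revenue by month"

# The table whose metadata embedding is closest to the query wins.
best_table = max(
    table_embeddings,
    key=lambda t: cosine_similarity(query_embedding, table_embeddings[t]),
)
```

The top-ranked metadata entries are then handed to the LLM in Step 5 as context for SQL generation.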

๐Ÿ•ธ๏ธ Web Services Platform

Modern Web Scraping with AI Enhancement

  • Multi-provider search integration (Brave API)
  • Playwright browser automation with stealth capabilities
  • LLM-powered content extraction with predefined schemas
  • AI-enhanced filtering and relevance scoring
  • Human behavior simulation for anti-detection
  • Tools: scrape_webpage, scrape_multiple_pages, extract_page_links, search_page_content

RAG & Document Analytics

Intelligent Document Processing

  • Supabase pgvector integration for vector storage
  • Multi-format document processing (PDF, DOC, DOCX, PPT, PPTX, TXT)
  • Quick RAG question-answering for documents
  • Multi-collection support with user isolation
  • Tools: search_rag_documents, add_rag_documents, quick_rag_question

๐Ÿ›๏ธ Shopify E-commerce Integration

Complete E-commerce Workflow

  • Product search and catalog management
  • Shopping cart operations and management
  • Checkout and payment processing (test environment)
  • Customer profile and shipping address management
  • Tools: search_products, add_to_cart, view_cart, start_checkout, process_payment

๐Ÿ–ผ๏ธ AI Image Generation

Creative AI Services

  • AI image creation with custom prompts
  • Image-to-image transformation capabilities
  • File-based image generation and storage
  • Tools: generate_image, generate_image_to_file, image_to_image
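The image tools are listed above without request examples; here are hypothetical payloads, where the parameter names (prompt, size, source_image) are assumptions rather than the documented tool schema:

```python
# Hypothetical request payloads; parameter names are assumptions,
# not the documented tool schema.
generate_request = {
    "prompt": "a watercolor city skyline at dusk",
    "size": "1024x1024",
}
transform_request = {
    "source_image": "/path/to/input.png",
    "prompt": "the same scene in winter",
}
# With a connected MCP client, the calls would look like:
# await client.call_tool("generate_image", generate_request)
# await client.call_tool("image_to_image", transform_request)
```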

Memory Management System

Persistent Information Storage

  • Categorized memory storage with keyword tagging
  • Intelligent memory retrieval and search
  • Secure memory deletion with authorization
  • Tools: remember, forget, update_memory, search_memories

Event Sourcing & Background Tasks

Asynchronous Processing

  • Background task creation and lifecycle management
  • Event-driven architecture support
  • Task monitoring and control (pause/resume/delete)
  • Tools: create_background_task, list_background_tasks, pause_background_task
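The pause/resume/delete controls above can be thought of as a small state machine over each task. This is an illustrative in-memory model, not the server's implementation:

```python
class BackgroundTask:
    """Illustrative model of the task lifecycle; not the server's implementation."""

    def __init__(self, name):
        self.name = name
        self.state = "running"  # tasks start in the running state

    def pause(self):
        # Only a running task can be paused.
        if self.state == "running":
            self.state = "paused"

    def resume(self):
        # Only a paused task can be resumed.
        if self.state == "paused":
            self.state = "running"

task = BackgroundTask("nightly_sales_report")
task.pause()  # task.state is now "paused"
```

In the real server these transitions happen via pause_background_task and its siblings rather than direct method calls.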

๐Ÿ” Security & Administration

Enterprise-Grade Security

  • Multi-level authorization (LOW, MEDIUM, HIGH)
  • JWT-based authentication with bcrypt password hashing
  • Comprehensive audit logging and monitoring
  • Human-in-the-loop interaction workflows
  • Tools: request_authorization, check_security_status, get_audit_log
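The LOW/MEDIUM/HIGH tiers form an ordered gate: a caller may invoke a tool only if their clearance meets or exceeds the tool's requirement. A minimal sketch of that check, where the function name and shape are assumptions rather than the server's API:

```python
# Ordered authorization tiers, matching the LOW/MEDIUM/HIGH levels above.
LEVELS = {"LOW": 0, "MEDIUM": 1, "HIGH": 2}

def is_authorized(user_level, required_level):
    """True when the caller's clearance meets or exceeds the tool's requirement."""
    return LEVELS[user_level] >= LEVELS[required_level]

# e.g. a MEDIUM-clearance user may run LOW and MEDIUM tools, but not HIGH ones.
```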

AI-Powered Features

Auto-Discovery System

The Smart MCP Server automatically discovers and registers all available tools, prompts, and resources:

# Auto-discovery extracts metadata from docstrings and function signatures
from core.auto_discovery import AutoDiscovery

discovery = AutoDiscovery()
tools = await discovery.discover_tools()  # Finds all MCP tools
prompts = await discovery.discover_prompts()  # Extracts prompts
resources = await discovery.discover_resources()  # Identifies resources

AI Tool Selection

Intelligent tool recommendation based on natural language queries:

from core.ai_selectors import ToolSelector

selector = ToolSelector()
# Natural language query gets matched to appropriate tools
tools = await selector.select_tools("I need to analyze sales data from my database")
# Returns: ['data_sourcing', 'data_query'] with confidence scores

API Documentation

Data Analytics API

# Extract database metadata and create embeddings
await client.call_tool("data_sourcing", {
    "connection_string": "postgresql://user:pass@host:5432/db",
    "tables": ["sales", "customers"]
})

# Query data using natural language
await client.call_tool("data_query", {
    "query": "Show me top 10 customers by revenue this month",
    "connection_string": "postgresql://user:pass@host:5432/db"
})

Web Services API

# Advanced web scraping with AI extraction
await client.call_tool("scrape_webpage", {
    "url": "https://example.com",
    "extraction_schema": {
        "products": ["name", "price", "description"]
    },
    "use_stealth": True
})

# Multi-page scraping with pagination
await client.call_tool("scrape_multiple_pages", {
    "base_url": "https://example.com/products",
    "max_pages": 10,
    "extraction_schema": {"products": ["name", "price"]}
})

RAG & Document Analytics API

# Quick document Q&A
await client.call_tool("quick_rag_question", {
    "file_path": "/path/to/document.pdf",
    "question": "What are the key findings in this report?"
})

# Add documents to RAG collection
await client.call_tool("add_rag_documents", {
    "collection_name": "company_docs",
    "documents": ["Document content..."],
    "metadatas": [{"source": "report.pdf"}]
})

# Search RAG documents
await client.call_tool("search_rag_documents", {
    "collection_name": "company_docs",
    "query": "quarterly results",
    "n_results": 5
})

Shopify E-commerce API

# Search products
await client.call_tool("search_products", {
    "query": "wireless headphones",
    "limit": 10
})

# Add to cart and checkout
await client.call_tool("add_to_cart", {
    "product_id": "12345",
    "quantity": 2
})

await client.call_tool("start_checkout", {
    "cart_id": "cart_123"
})
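process_payment is listed among the Shopify tools but not shown above. Here is a hedged sketch against a stand-in client; the checkout_id and payment_method parameter names are assumptions:

```python
import asyncio

class FakeClient:
    """Stand-in for an MCP client, used here only so the example runs."""

    async def call_tool(self, name, args):
        # A real client would send the request to the server; we just echo it.
        return {"tool": name, "args": args}

async def pay():
    client = FakeClient()
    return await client.call_tool("process_payment", {
        "checkout_id": "checkout_123",   # assumed parameter name
        "payment_method": "test_card",   # test environment only
    })

result = asyncio.run(pay())
```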

Memory Management API

# Store information with categories
await client.call_tool("remember", {
    "key": "customer_preferences",
    "value": "Prefers email notifications",
    "category": "customer_data",
    "keywords": ["email", "notifications", "preferences"]
})

# Search memories
await client.call_tool("search_memories", {
    "query": "customer email preferences"
})
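forget and update_memory are listed in the memory toolset but not shown above; these are hypothetical payloads, with field names assumed to mirror the remember example:

```python
# Hypothetical payloads; field names assumed to mirror the remember call above.
update_args = {
    "key": "customer_preferences",
    "value": "Prefers SMS notifications",
}
forget_args = {"key": "customer_preferences"}
# With a connected MCP client:
# await client.call_tool("update_memory", update_args)
# await client.call_tool("forget", forget_args)
```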

Testing & Quality Assurance

Comprehensive Test Suite

The project includes 40+ test files covering all major components:

# Run all tests
pytest

# Run specific service tests
pytest tests/test_data_analytics_service.py
pytest tests/test_web_services.py
pytest tests/test_rag_operations.py
pytest tests/test_shopify_integration.py

# Run performance benchmarks
pytest tests/test_performance.py -v

Test Coverage Areas

  • Data Analytics: Complete 5-step workflow testing
  • Web Services: Scraping, extraction, and anti-detection
  • RAG & Documents: Multi-format processing and Q&A
  • E-commerce: Shopping cart and checkout workflows
  • Security: Authentication, authorization, and audit logging
  • AI Features: Tool selection and auto-discovery
  • Integration: End-to-end service interactions

Performance Metrics

  • AI Tool Selection: <200ms response time
  • Web Scraping Success Rate: 95%+
  • Database Query Performance: <500ms average
  • Memory Usage per Container: ~500MB
  • Docker Container Startup: <30 seconds

๐Ÿ“ Development Standards

  • Code Quality: PEP 8 compliance with type annotations
  • Testing: Comprehensive unit and integration tests
  • Architecture: Async-first with proper error handling
  • Security: Multi-level authorization and audit logging
  • Documentation: Complete docstrings and API documentation
  • Performance: Optimized for production workloads

Troubleshooting

Common Issues

  1. Database Connection Errors
# Check PostgreSQL status
docker-compose ps postgres

# Verify pgvector extension
psql -h localhost -U postgres -c "SELECT * FROM pg_extension WHERE extname='vector';"
  2. AI Tool Selection Issues
# Check if embeddings are properly generated
python -c "from core.auto_discovery import AutoDiscovery; print(AutoDiscovery().get_tool_embeddings())"
  3. Web Scraping Failures
# Check Playwright browser installation
playwright install chromium

# Verify anti-detection settings
python -c "from tools.services.web_services.browser_manager import BrowserManager; BrowserManager().test_stealth()"
  4. Memory Issues in Production
# Monitor container memory usage
docker stats

# Check Redis cache status
redis-cli ping

Deployment Options

Development (Local)

# Single server for development
python smart_mcp_server.py

# Access at http://localhost:4321

Production (Docker Cluster)

# Start complete cluster with load balancer
docker-compose up -d

# 3 Smart MCP servers + Nginx load balancer
# Access at http://localhost:8081

Cloud Deployment (Railway)

# One-click deployment to Railway
railway up

# Automatic SSL, scaling, and monitoring
# Cost: $10-25/month

Manual Docker Build

# Build production image
docker build -f Dockerfile.production -t isa-mcp:latest .

# Run with environment variables
docker run -d \
  -p 4321:4321 \
  -e DATABASE_URL=postgresql://... \
  -e REDIS_URL=redis://... \
  isa-mcp:latest

Performance & Monitoring

Health Monitoring

# Check cluster health
curl http://localhost:8081/health

# Individual server health
curl http://localhost:4321/health
curl http://localhost:4322/health
curl http://localhost:4323/health

Metrics Collection

  • Prometheus: Metrics collection and alerting
  • Grafana: Performance dashboards
  • Audit Logging: Security and compliance tracking
  • Load Balancer Stats: Request distribution and response times

Production Readiness Checklist

  • Docker cluster with load balancing
  • Health monitoring and automatic failover
  • Security: JWT auth, bcrypt passwords, audit logs
  • Performance: <200ms AI tool selection, 95%+ uptime
  • Testing: 40+ test files, integration tests
  • Documentation: Complete API docs and deployment guides

Key Differentiators

  1. AI-Powered Intelligence: Unlike traditional MCP servers, isA_MCP provides intelligent tool selection
  2. Enterprise Architecture: Production-grade cluster deployment with monitoring
  3. Comprehensive Integration: 35+ tools across 11 service categories
  4. Modern Web Platform: Advanced scraping with anti-detection capabilities
  5. Extensible Design: Modular architecture for easy service addition

๐Ÿค Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Development Setup

# Install development dependencies
pip install -r requirements-dev.txt

# Run tests before submitting
pytest

# Run linting
flake8 --config .flake8
black --check .


License

MIT License - see LICENSE file for details.


Status: Production Ready | Version: 2.0.0 | Last Updated: 2024-12-30
