
VIPER - Vulnerability Intelligence, Prioritization, and Exploitation Reporter MCP Server

Project description


🛡️ VIPER - Vulnerability Intelligence, Prioritization, and Exploitation Reporter

VIPER is your AI-powered co-pilot in the complex world of cyber threats, designed to provide actionable Vulnerability Intelligence, Prioritization, and Exploitation Reporting.

In an era of ever-increasing cyber threats, VIPER cuts through the noise. It ingests data from critical sources such as NVD, EPSS, and the CISA KEV catalog, then leverages Google Gemini AI for deep contextual analysis and vulnerability prioritization. All of this intelligence is centralized, enriched, and presented through multiple interfaces: an interactive Streamlit dashboard, a powerful CLI, and Claude Desktop integration via MCP (Model Context Protocol) for natural-language vulnerability analysis.

🚀 NEW: Claude Desktop Integration via MCP

VIPER now includes a Model Context Protocol (MCP) server that integrates seamlessly with Claude Desktop, providing 12 powerful cybersecurity tools accessible through natural language:

📺 Viper MCP Demo

Watch the VIPER Demo Video

  • perform_live_cve_lookup - Full CVE analysis
  • get_nvd_cve_details - NVD data
  • get_epss_data_for_cve - Exploitation probability
  • check_cve_in_cisa_kev - CISA KEV status
  • search_public_exploits_for_cve - GitHub/Exploit-DB search
  • get_gemini_cve_analysis - AI analysis
  • get_viper_risk_score - Risk scoring
  • save_cve_data_to_viperdb - Database storage
  • And 4 more...

Usage Examples:

"Analyze CVE-2024-3400 with full Viper analysis"
"Find exploits for CVE-2023-44487"
"Check if CVE-2024-1234 is in CISA KEV"

🔧 Quick Setup

1. Install

git clone https://github.com/ozanunal0/viper.git
cd viper
./setup.sh

2. Configure

cp env.example .env
# Edit .env with your settings:
# - GEMINI_API_KEY for Gemini AI (default provider)
# - Or set LLM_PROVIDER=ollama for local LLM
# - Or set LLM_PROVIDER=openai and provide OPENAI_API_KEY (+ optional OPENAI_MODEL_NAME)

3. Claude Desktop MCP

{
  "mcpServers": {
    "ViperMCPServer": {
      "command": "/FULL/PATH/TO/viper/run_mcp_clean.sh"
    }
  }
}

📊 Screenshots

Home Screen

(screenshot: Home)

Main Dashboard

(screenshot: main dashboard.png)

Detailed Analysis View

(screenshot: details va.png)

Live CVE Lookup

(screenshot: live cve lookup.png)

Analytics & Trends

(screenshot: analytics.png)


🖥️ Usage

Claude Desktop (Recommended):

  • Natural language vulnerability analysis
  • Real-time CVE lookups
  • Risk scoring and prioritization

Dashboard:

python main.py dashboard

CLI:

python main.py cli --days 7

🏠 Local LLM with Ollama

For privacy-focused analysis without external API dependencies:

Quick Setup

# 1. Set LLM provider to Ollama
echo "LLM_PROVIDER=ollama" >> .env

# 2. Start with Docker Compose (includes Ollama)
docker-compose up -d

# 3. Pull a model (run once)
docker exec -it viper_ollama ollama pull llama3:8b

# 4. Access VIPER at http://localhost:8501

Available Models

  • llama3:8b - Good balance of speed and quality (default)
  • llama3:70b - Higher quality, requires more resources
  • codellama:7b - Optimized for code analysis
  • mistral:7b - Fast and efficient

Configuration

# In .env file
LLM_PROVIDER=ollama
OLLAMA_API_BASE_URL=http://localhost:11434  # or http://ollama:11434 in Docker
LOCAL_LLM_MODEL_NAME=llama3:8b
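For reference, here is a minimal sketch of a call against Ollama's HTTP API (/api/generate) using this configuration. The helper name is illustrative, not VIPER internals:

```python
"""Sketch: build a request for Ollama's /api/generate endpoint."""
import json
import os
import urllib.request


def build_request(prompt: str) -> tuple[str, bytes]:
    """Return the target URL and JSON body for a non-streaming generation."""
    base = os.getenv("OLLAMA_API_BASE_URL", "http://localhost:11434")
    model = os.getenv("LOCAL_LLM_MODEL_NAME", "llama3:8b")
    payload = {"model": model, "prompt": prompt, "stream": False}
    return f"{base}/api/generate", json.dumps(payload).encode()


url, body = build_request("Summarize CVE-2024-3400 in two sentences.")
# Sending it requires a running Ollama instance:
# urllib.request.urlopen(urllib.request.Request(
#     url, data=body, headers={"Content-Type": "application/json"}))
print(url)
```

Because the base URL comes from OLLAMA_API_BASE_URL, the same code works whether Ollama runs on the host or as the `ollama` service inside Docker Compose.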

☁️ Cloud LLM with OpenAI

Use OpenAI GPT models as the provider for AI analysis.

Configuration

# In .env file
LLM_PROVIDER=openai
OPENAI_API_KEY=sk-...
OPENAI_MODEL_NAME=gpt-4o-mini
# OPENAI_BASE_URL=https://api.openai.com/v1  # optional override

✨ Features

  • Multi-source data: NVD, EPSS, CISA KEV, Microsoft
  • Flexible AI analysis: Choose between Gemini AI, OpenAI GPT, or local Ollama models
  • Risk scoring: Weighted multi-factor scoring
  • Live lookup: Real-time CVE analysis
  • Multiple interfaces: Dashboard, CLI, Claude Desktop
  • Privacy options: Local LLM support for offline/private analysis
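To illustrate what weighted multi-factor scoring means in practice (the weights and formula below are assumptions for the sketch, not VIPER's published algorithm): severity, exploitation probability, and known-exploited status each contribute to a single 0-100 priority score.

```python
"""Illustrative weighted risk score from CVSS, EPSS, and CISA KEV status."""


def risk_score(cvss: float, epss: float, in_kev: bool) -> float:
    """Blend CVSS (0-10), EPSS probability (0-1), and a KEV bonus into 0-100.

    Example weights: 50% CVSS, 30% EPSS, 20-point bonus for KEV membership.
    """
    score = 50 * (cvss / 10) + 30 * epss + (20 if in_kev else 0)
    return round(min(score, 100.0), 1)


# A critical, actively exploited CVE lands near the top of the queue:
print(risk_score(cvss=9.8, epss=0.94, in_kev=True))  # → 97.2
```

The KEV bonus is what lets a medium-CVSS vulnerability outrank a critical one that nobody is exploiting, which is the core idea behind prioritizing on evidence rather than severity alone.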

📚 Documentation

Project Roadmap & Future Vision

Here's where we're headed:

Phase 1: Core Enhancements & Data Completeness (Immediate Focus)

Full NVD API Pagination: Ensure complete ingestion of all relevant CVEs from NVD by implementing robust pagination in nvd_client.py to handle large result sets (addressing current partial data fetching).
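For context, the NVD API 2.0 paginates with the startIndex and resultsPerPage query parameters (capped at 2,000 results per request); the set of requests needed to cover a result set can be sketched as:

```python
"""Sketch: plan the paginated requests needed to fetch a full NVD result set."""


def page_params(total_results: int, per_page: int = 2000):
    """Yield startIndex/resultsPerPage query params until the set is covered."""
    start = 0
    while start < total_results:
        yield {"startIndex": start, "resultsPerPage": per_page}
        start += per_page


# A 4,500-result window needs three requests: startIndex 0, 2000, 4000.
print(len(list(page_params(4500))))  # → 3
```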

Solidify Retry Mechanisms: Continuously refine and test tenacity-based retry logic across all external API clients (nvd_client.py, epss_client.py, cisa_kev_client.py, microsoft_update_client.py, gemini_analyzer.py) for maximum resilience.

✅ Dashboard Usability & Features:

Refine real-time CVE lookup: Optimize display and ensure all enrichment (EPSS, KEV, MSData, Gemini re-analysis) is available for live queries.

Enhance filtering and sorting options on all data tables.

Implement detailed CVE view modals or dedicated pages for better readability of all enriched data.

🚧 Automated Periodic Execution: Integrate APScheduler or configure system cron jobs to run the main_mvp.py data pipeline automatically at configurable intervals.

Phase 2: Expanding Data Ingestion & Enrichment

  • [✅] Local LLM Support (Ollama Integration):
    • ✅ Implemented local LLM support through Ollama for enhanced privacy and offline capabilities.
    • ✅ AI-powered vulnerability analysis without external API dependencies.
    • ✅ Support for popular models like Llama3, Code Llama, and other Ollama-compatible models.
    • ✅ Configurable model selection and deployment options via environment variables.

Other CISA Products & Feeds: Explore and integrate other relevant CISA feeds beyond the KEV catalog (e.g., CISA Alerts, Industrial Control Systems Advisories if applicable).

Comprehensive Microsoft Patch Tuesday Parsing: Further refine microsoft_update_client.py to ensure accurate and detailed extraction of product families, specific product versions, and direct links to KB articles/MSRC guidance from CVRF/CSAF data.

Phase 3: Developing "Threat Analyst Agent" Capabilities

  • [🚧] Semantic Web Search Integration (EXA AI):
    • For high-priority CVEs or emerging threats, automatically search the web for technical analyses, blog posts, news articles, and threat actor reports.
    • Store relevant article metadata (URL, title, snippet, source) linked to CVEs.
  • [🚧] AI-Powered Content Analysis (Gemini):
    • Summarization: Use Gemini to summarize fetched articles and reports related to a CVE.
    • Key Information Extraction: Extract TTPs (Tactics, Techniques, and Procedures), affected software/hardware, and potential mitigations from unstructured text.
    • Cross-Validation Support: Assist analysts by comparing information from different sources regarding a specific threat.

Phase 4: Building "Threat Hunting Agent" Foundations

  • [📝] Enhanced IOC Extraction:
    • Expand IOC (IPs, domains, hashes, URLs, mutexes, registry keys) extraction from all ingested text sources (NVD descriptions, MSRC summaries, KEV details, fetched articles) using Gemini's advanced understanding or specialized libraries like iocextract.
    • Create a robust, searchable IOC database.
  • [📝] Natural Language to Query Translation (Advanced):
    • Leverage Gemini to translate natural language threat hunting hypotheses (e.g., "Are there any Cobalt Strike beacons communicating with newly registered domains?") into structured query formats like OCSF, KQL (Azure Sentinel), or Splunk SPL.
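To give a flavor of the IOC-extraction idea in Phase 4 (toy regexes only; production code would use a library such as iocextract, which additionally handles defanged indicators like hxxp:// and [.]):

```python
"""Toy IOC extraction sketch: pull indicators out of unstructured report text."""
import re

IOC_PATTERNS = {
    "ipv4": r"\b(?:\d{1,3}\.){3}\d{1,3}\b",
    "sha256": r"\b[a-fA-F0-9]{64}\b",
    "domain": r"\b(?:[a-z0-9-]+\.)+(?:com|net|org|io)\b",
}


def extract_iocs(text: str) -> dict:
    """Return {ioc_type: [matches]} for each pattern."""
    return {kind: re.findall(pat, text) for kind, pat in IOC_PATTERNS.items()}


report = "Beacon at 203.0.113.7 resolved evil-c2.example.com"
print(extract_iocs(report)["ipv4"])  # → ['203.0.113.7']
```

Extracted indicators would then be normalized and written to the searchable IOC database the roadmap describes.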

Phase 5: Broader Intelligence Gathering & Advanced Analytics

  • [📝] Social Media Monitoring & Clustering (Advanced):
    • Ingest data from platforms like Twitter/X or specific Reddit communities (e.g., r/netsec) for early signals of new vulnerabilities or exploits.
    • Apply LLM-based semantic clustering (Gemini) to group discussions and identify emerging threat trends.
  • [📝] Threat Actor & Malware Profiling:
    • Begin associating CVEs and IOCs with known threat actors and malware families (potentially integrating with MISP or other OSINT feeds).
    • Visualize these relationships in the dashboard.
  • [📝] Advanced Dashboard Analytics:
    • Implement more sophisticated trend analysis, predictive insights (beyond EPSS), and customizable reporting features.

Phase 6: Platform Maturity & Usability

  • [📝] User Accounts & Collaboration (Long-term): Allow multiple users, role-based access, and collaborative analysis features (e.g., shared notes, investigation assignments).
  • [📝] Notification System: Implement email or other notifications for high-priority alerts or newly discovered critical CVEs matching predefined criteria.
  • [📝] Database Optimization/Migration: For larger deployments, consider migrating from SQLite to a more scalable database like PostgreSQL.

Star ⭐ the repo if VIPER helps with your vulnerability management!

Project details


Download files

Download the file for your platform.

Source Distribution

iflow_mcp_ozanunal0_viper-1.0.9.tar.gz (131.2 kB)

Uploaded Source

Built Distribution


iflow_mcp_ozanunal0_viper-1.0.9-py3-none-any.whl (140.7 kB)

Uploaded Python 3

File details

Details for the file iflow_mcp_ozanunal0_viper-1.0.9.tar.gz.

File metadata

  • Download URL: iflow_mcp_ozanunal0_viper-1.0.9.tar.gz
  • Upload date:
  • Size: 131.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 (Debian GNU/Linux 13 "trixie")

File hashes

Hashes for iflow_mcp_ozanunal0_viper-1.0.9.tar.gz

  • SHA256: 6615cfe387de51f1658eb5784a9cc2cdc9b0476ad2fd27b58faf1731d2bab4c6
  • MD5: 32b21da53b817f676ee388f10d5710d9
  • BLAKE2b-256: e08dec878028b2055649c5956ae773428e5f8fd68c5dc28d14ca7938d667a811


File details

Details for the file iflow_mcp_ozanunal0_viper-1.0.9-py3-none-any.whl.

File metadata

  • Download URL: iflow_mcp_ozanunal0_viper-1.0.9-py3-none-any.whl
  • Upload date:
  • Size: 140.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 (Debian GNU/Linux 13 "trixie")

File hashes

Hashes for iflow_mcp_ozanunal0_viper-1.0.9-py3-none-any.whl

  • SHA256: 3454cbfb0ad9d4e926dc98a9917d496630c840249b4c28baa0b9fa0a868a9a19
  • MD5: 88c6d44ae80b3345e6d4572276a81db5
  • BLAKE2b-256: 23bfbc8ffc0412f189db2f8a810effee9d013f7d3ead6bb61ca163b6f98954e0

