
Log Analyzer MCP

License: MIT | Python 3.10+

An MCP (Model Context Protocol) server for AI-powered log analysis. Parse, search, and debug log files directly in Claude Code or any MCP-compatible client.

Features

  • Auto-Detection — Identifies log format from 9+ common formats
  • Smart Search — Pattern matching with context lines, regex support, and time filtering
  • Error Extraction — Groups similar errors, captures stack traces, counts occurrences
  • Summarization — Generates debugging insights with anomaly detection
  • Correlation — Finds related events around error occurrences
  • Real-time Watching — Monitor logs for new entries with position tracking
  • Pattern Suggestions — AI-powered pattern discovery for debugging
  • Trace Extraction — Extract and follow trace/correlation IDs across distributed systems
  • Multi-File Analysis — Merge, correlate, and compare logs across multiple files
  • Natural Language Queries — Ask questions about logs in plain English
  • Sensitive Data Detection — Scan for PII, credentials, and secrets in logs
  • Streaming — Handles large files (1GB+) without loading into memory
  • Multiple Formats — Markdown and JSON output
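The streaming approach mentioned above can be sketched in a few lines: iterate the file lazily so only one line is held in memory at a time. A minimal illustration of the idea, not the server's actual code:

```python
from collections import Counter
from typing import Iterator

def iter_log_lines(path: str) -> Iterator[str]:
    # The file object is an iterator; lines are read on demand,
    # so memory use stays constant regardless of file size.
    with open(path, "r", errors="replace") as f:
        for line in f:
            yield line.rstrip("\n")

def count_levels(path: str) -> Counter:
    """Tally lines per level keyword while streaming."""
    counts: Counter = Counter()
    for line in iter_log_lines(path):
        for level in ("ERROR", "WARN", "INFO"):
            if level in line:
                counts[level] += 1
                break
    return counts
```

This is why 1GB+ files are tractable: the cost is one pass over the file, not one allocation of its full contents.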

Supported Log Formats

| Format | Example Pattern |
|---|---|
| Syslog | `Jan 15 10:30:00 hostname process[pid]: message` |
| Apache/Nginx Access | `127.0.0.1 - - [15/Jan/2026:10:30:00 +0000] "GET /path" 200` |
| Apache/Nginx Error | `[Thu Jan 15 10:30:00 2026] [error] [pid 1234] message` |
| JSON Lines | `{"timestamp": "...", "level": "ERROR", "message": "..."}` |
| Docker/Container | `2026-01-15T10:30:00.123Z stdout message` |
| Python Logging | `2026-01-15 10:30:00,123 - module - ERROR - message` |
| Java/Log4j | `2026-01-15 10:30:00,123 ERROR [thread] class - message` |
| Kubernetes | `level=error msg="..." ts=2026-01-15T10:30:00Z` |
| Generic Timestamp | Any line with a recognizable timestamp |

Installation

Quick Install (Recommended)

```shell
uvx codesdevs-log-analyzer install
```

This automatically adds the MCP server to your Claude Code settings. Restart Claude Code to start using it.

To uninstall:

```shell
uvx codesdevs-log-analyzer uninstall
```

Manual Installation

pip

```shell
pip install codesdevs-log-analyzer
```

uv

```shell
uv tool install codesdevs-log-analyzer
```

Claude Code

Add to your ~/.claude/settings.json:

```json
{
  "mcpServers": {
    "log-analyzer": {
      "command": "uvx",
      "args": ["codesdevs-log-analyzer"]
    }
  }
}
```

Or if installed via pip:

```json
{
  "mcpServers": {
    "log-analyzer": {
      "command": "codesdevs-log-analyzer"
    }
  }
}
```

Restart Claude Code and the tools will be available.

Usage

With Claude Code

Just describe what you need:

```
Analyze /var/log/nginx/error.log and tell me what's causing the 502 errors
Search for "timeout" in my app.log with 5 lines of context before and after
Give me a summary of errors from /var/log/app.log in the last hour
What happened in the 60 seconds before each OutOfMemoryError in my Java logs?
```

Available Tools

| Tool | Description |
|---|---|
| `log_analyzer_parse` | Detect format, extract metadata, show samples |
| `log_analyzer_search` | Search patterns with context |
| `log_analyzer_extract_errors` | Extract and group all errors |
| `log_analyzer_summarize` | Generate debugging summary |
| `log_analyzer_tail` | Get recent log entries |
| `log_analyzer_correlate` | Find events around anchor patterns |
| `log_analyzer_diff` | Compare log files or time periods |
| `log_analyzer_watch` | Watch log file for new entries (polling-based) |
| `log_analyzer_suggest_patterns` | Suggest useful search patterns based on log content |
| `log_analyzer_trace` | Extract and follow trace/correlation IDs across log entries |
| `log_analyzer_multi` | Analyze and correlate logs across multiple files |
| `log_analyzer_ask` | Translate natural language questions into tool calls |
| `log_analyzer_scan_sensitive` | Detect PII, credentials, and secrets in logs |
| `log_analyzer_suggest_format` | Suggest log format based on content analysis |

Examples

Analyze a Log File

Prompt:

Analyze /var/log/app.log

Output:

```
## Log Analysis: /var/log/app.log

**Format:** Python logging (confidence: 98%)
**Lines:** 15,432 parsed
**Time Range:** 2026-01-15 00:00:01 → 23:59:58

### Level Distribution
ERROR  ████████░░░░░░░░░░░░  1,234 (8%)
WARN   ██████████░░░░░░░░░░  2,345 (15%)
INFO   ████████████████████  11,853 (77%)

### Sample Entries
[First 5 and last 5 entries shown]
```

Search with Context

Prompt:

Search for "connection refused" in /var/log/nginx/error.log with 3 lines context

Output:

```
## Search Results: "connection refused"

Found **23 matches** in 5,432 lines

### Match 1 (line 1234)

[context before]
2026-01-15 10:30:00 [error] connect() failed: Connection refused
[context after]

...
```

Extract Errors with Stack Traces

Prompt:

Extract all errors from /var/log/java-app.log, group similar ones

Output:

```
## Errors: /var/log/java-app.log

**Total:** 456 errors (23 unique patterns)

### 1. NullPointerException (187 occurrences)
- **First:** 2026-01-15 03:45:12
- **Last:** 2026-01-15 22:15:33
- **Sample:**

java.lang.NullPointerException: Cannot invoke method on null
    at com.example.UserService.getUser(UserService.java:45)
    at com.example.ApiController.handleRequest(ApiController.java:123)

...
```
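Grouping "similar" errors usually means normalizing the variable parts of a message into a signature, then counting signatures. A plausible sketch of the idea, not necessarily the tool's exact algorithm:

```python
import re
from collections import Counter

def error_signature(message: str) -> str:
    """Replace variable parts with placeholders so similar errors
    collapse into one group. A heuristic, not the tool's real logic."""
    sig = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", message)   # hex addresses
    sig = re.sub(r"\d+", "<NUM>", sig)                  # numbers, IDs, ports
    sig = re.sub(r"'[^']*'", "'<STR>'", sig)            # quoted values
    return sig

errors = [
    "Timeout after 30s for request 8841",
    "Timeout after 45s for request 9102",
    "NullPointerException at line 45",
]
groups = Counter(error_signature(e) for e in errors)
```

Here the two timeout messages share the signature `Timeout after <NUM>s for request <NUM>` and are counted as one group of two occurrences.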

Watch Logs for New Errors

Prompt:

Watch /var/log/app.log for new errors while I test my changes

Usage:

```python
# First call - get current position
log_analyzer_watch(file_path="/var/log/app.log", from_position=0)
# Returns: current_position=123456

# After triggering action - check for new errors
log_analyzer_watch(file_path="/var/log/app.log", from_position=123456, level_filter="ERROR")
# Returns: new_entries=[...], current_position=234567
```

Get Pattern Suggestions

Prompt:

What patterns should I search for in /var/log/app.log to debug this issue?

Output:

```
## Suggested Patterns for /var/log/app.log

### High Priority

1. **Database Connection Errors** (23 matches)
   - Pattern: `connection (refused|timeout|reset)`
   - Example: "connection refused to postgres:5432"

2. **Authentication Failures** (15 matches)
   - Pattern: `(auth|login|authentication) failed`
   - Example: "authentication failed for user admin"

### Medium Priority

3. **Request IDs** (1,234 matches)
   - Pattern: `req-[a-f0-9]{8}`
   - Use for tracing specific requests
```

Tool Parameters

log_analyzer_parse

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `format_hint` | string | auto | Force a specific format |
| `max_lines` | int | 10000 | Lines to analyze |
| `response_format` | string | markdown | `markdown` or `json` |

log_analyzer_search

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `pattern` | string | required | Search pattern |
| `is_regex` | bool | false | Use regex matching |
| `context_lines` | int | 3 | Lines before/after each match |
| `max_matches` | int | 50 | Maximum results |
| `level_filter` | string | null | Filter by level |
| `time_start` | string | null | Filter from time |
| `time_end` | string | null | Filter until time |
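The `context_lines` behaviour is the familiar `grep -C` idea. A minimal sketch of plain substring search with context (regex, level, and time filtering omitted):

```python
def search_with_context(lines: list[str], pattern: str, context: int = 3):
    """Return (line_number, before, match, after) tuples for every line
    containing the pattern. Illustrative only; substring match, no regex."""
    results = []
    for i, line in enumerate(lines):
        if pattern in line:
            before = lines[max(0, i - context):i]   # up to `context` lines before
            after = lines[i + 1:i + 1 + context]    # up to `context` lines after
            results.append((i + 1, before, line, after))
    return results
```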

log_analyzer_extract_errors

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `include_warnings` | bool | false | Include WARN level |
| `group_similar` | bool | true | Group similar errors |
| `max_errors` | int | 100 | Maximum errors |

log_analyzer_summarize

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `focus` | string | all | One of `errors`, `performance`, `security`, `all` |
| `max_lines` | int | 10000 | Lines to analyze |

log_analyzer_correlate

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `anchor_pattern` | string | required | Pattern to correlate around |
| `window_seconds` | int | 60 | Time window |
| `max_anchors` | int | 10 | Maximum anchor points |

log_analyzer_watch

Watch a log file for new entries using position-based polling. Useful for real-time monitoring.

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `from_position` | int | 0 | File position to start from (0 = get current end) |
| `max_lines` | int | 100 | Maximum lines to read per call |
| `level_filter` | string | null | Filter by level (e.g., `ERROR` or `ERROR,WARN`) |
| `pattern_filter` | string | null | Regex pattern to filter messages |

Usage Flow:

  1. First call with from_position=0 returns current file position
  2. Subsequent calls with returned position get new entries
  3. Repeat to "watch" for new log entries
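The flow above can be sketched with ordinary file positions. This illustrates the mechanism only (it is not the server's implementation), and it follows the tool's convention that `from_position=0` just reports the current end of file:

```python
import os

def watch_once(path: str, from_position: int = 0) -> tuple[list[str], int]:
    """Read lines appended since from_position; return them plus the new
    position to pass into the next call."""
    size = os.path.getsize(path)
    if from_position == 0:
        # First call: report the current end so the caller can poll from here.
        return [], size
    with open(path, "rb") as f:
        f.seek(from_position)
        data = f.read()
    return data.decode("utf-8", errors="replace").splitlines(), from_position + len(data)
```

Each call is stateless; the caller holds the position, which is what makes this safe across separate tool invocations.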

log_analyzer_suggest_patterns

Analyze a log file and suggest useful search patterns based on content analysis.

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `focus` | string | all | Focus area: `all`, `errors`, `security`, `performance`, `identifiers` |
| `max_patterns` | int | 10 | Maximum patterns to suggest |
| `max_lines` | int | 10000 | Lines to analyze |

Focus Areas:

  • all — Analyze all pattern categories
  • errors — Focus on error message patterns
  • security — Focus on auth failures, unauthorized access
  • performance — Focus on slow requests, timeouts
  • identifiers — Focus on UUIDs, request IDs, user IDs

log_analyzer_trace

Extract and follow trace/correlation IDs across log entries.

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `trace_id` | string | null | Specific trace ID to extract (null = all traces) |
| `trace_patterns` | list | auto | Custom regex patterns for trace IDs |
| `max_traces` | int | 50 | Maximum traces to return |
| `max_lines` | int | 100000 | Lines to scan |

Auto-detected formats: OpenTelemetry, UUID, AWS X-Ray, custom patterns.
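For illustration, two of these ID shapes can be matched with simple regexes. These patterns are examples only, not the tool's actual detection set:

```python
import re

TRACE_PATTERNS = {
    # Standard UUID: 8-4-4-4-12 hex digits
    "uuid": re.compile(
        r"\b[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\b",
        re.IGNORECASE,
    ),
    # W3C traceparent (used by OpenTelemetry): version-trace_id-span_id-flags
    "traceparent": re.compile(r"\b00-[0-9a-f]{32}-[0-9a-f]{16}-[0-9a-f]{2}\b"),
}

def find_trace_ids(line: str) -> list[str]:
    """Return every trace-like ID found in a log line."""
    ids = []
    for pattern in TRACE_PATTERNS.values():
        ids.extend(pattern.findall(line))
    return ids
```

Following a trace is then grouping lines by the extracted ID, which reconstructs one request's path across services.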

log_analyzer_multi

Analyze and correlate logs across multiple files.

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_paths` | list | required | List of log file paths |
| `operation` | string | merge | `merge`, `correlate`, or `compare` |
| `time_window_seconds` | int | 60 | Time window for correlation |
| `max_entries` | int | 1000 | Maximum entries to return |

Operations:

  • merge — Interleave entries by timestamp
  • correlate — Find events across files within time window
  • compare — Diff error patterns between files
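The merge operation amounts to interleaving already time-sorted streams. A sketch using `heapq.merge`, assuming entries of the shape `(iso_timestamp, message)`:

```python
import heapq
from datetime import datetime

def merge_logs(*logs):
    """Interleave several time-sorted logs into one ordered stream,
    comparing entries by their parsed timestamp."""
    return list(heapq.merge(*logs, key=lambda e: datetime.fromisoformat(e[0])))

app = [("2026-01-15T10:00:00", "app: start"), ("2026-01-15T10:00:05", "app: ready")]
db = [("2026-01-15T10:00:02", "db: connected")]
merged = merge_logs(app, db)
```

`heapq.merge` is lazy and never materializes the inputs, so the same shape scales to large files when the entries come from streaming parsers.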

log_analyzer_ask

Translate natural language questions into tool calls.

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `question` | string | required | Natural language question about the logs |

Example Questions:

  • "What errors happened in the last hour?"
  • "Show me failed login attempts"
  • "What's causing the high latency?"

log_analyzer_scan_sensitive

Scan for PII, credentials, and sensitive data in logs.

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `redact` | bool | false | Redact sensitive data in output |
| `categories` | list | all | Filter: email, credit_card, api_key, password, ssn, ip_address, phone, token, connection_string, private_key |
| `max_matches` | int | 100 | Maximum matches to return |
| `max_lines` | int | 100000 | Lines to scan |

Detects:

  • PII: Emails, credit cards, SSNs, phone numbers
  • Credentials: API keys, JWT tokens, AWS keys, Bearer tokens
  • Secrets: Passwords in URLs, private keys, database connection strings
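Detection of this kind usually comes down to category-tagged regexes plus optional substitution for redaction. Two illustrative detectors (the real tool covers far more categories and patterns):

```python
import re

SENSITIVE = {
    "email": re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"),
    # AWS access key IDs start with AKIA followed by 16 uppercase/digit chars
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
}

def scan_line(line: str, redact: bool = False) -> tuple[list[str], str]:
    """Return the categories found in a line and, if requested,
    a copy with each match replaced by a <CATEGORY> placeholder."""
    found = []
    for category, pattern in SENSITIVE.items():
        if pattern.search(line):
            found.append(category)
            if redact:
                line = pattern.sub(f"<{category.upper()}>", line)
    return found, line
```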

log_analyzer_suggest_format

Suggest log format based on content analysis.

| Parameter | Type | Default | Description |
|---|---|---|---|
| `file_path` | string | required | Path to log file |
| `sample_lines` | int | 100 | Lines to sample |

Development

Setup

```shell
git clone https://github.com/Fato07/log-analyzer-mcp
cd log-analyzer-mcp
uv sync
```

Run Tests

```shell
uv run pytest -v --cov
```

Type Checking

```shell
uv run mypy codesdevs_log_analyzer
```

Run Locally

```shell
uv run codesdevs-log-analyzer
```

Test with MCP Inspector

```shell
npx @modelcontextprotocol/inspector uv run codesdevs-log-analyzer
```

License

MIT License - see LICENSE for details.

Contributing

Contributions welcome! Please open an issue or submit a pull request.

Built by Fato07 at CodesDevs
