# HeapDumpStarDiver MCP Server
MCP (Model Context Protocol) server for JVM heap dump analysis. Converts HPROF heap dumps to Parquet files and provides DuckDB-powered SQL querying and automated memory waste detection — all accessible to any MCP-compatible AI agent.
## Install

```shell
pip install heapdump-stardiver-mcp
```
## Prerequisites

The `convert_heap_dump` tool requires the HeapDumpStarDiver Rust binary. Build it from source:

```shell
git clone https://github.com/ZacAttack/HeapDumpStarDiver.git
cd HeapDumpStarDiver
cargo build --release
```

Or set `HEAP_DUMP_STAR_DIVER_BINARY_OVERRIDE` to point at a pre-built binary.

If you only need to analyze existing Parquet files (via `open_session`), the Rust binary is not required.
## Agent Configuration

Add to your agent's MCP config:

```json
{
  "mcpServers": {
    "heapdump-stardiver": {
      "command": "heapdump-stardiver-mcp"
    }
  }
}
```
| Agent | Config file |
|---|---|
| Claude Code | `.mcp.json` in repo root |
| Claude Desktop | `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) |
| Cursor | `.cursor/mcp.json` in repo root |
| Kiro | `.kiro/settings/mcp.json` in repo root |
## Available Tools
| Tool | Description |
|---|---|
| `convert_heap_dump` | Convert HPROF → Parquet and open an analysis session |
| `open_session` | Open a session from existing Parquet files |
| `list_sessions` | Show all active sessions |
| `close_session` | Close the DuckDB connection, keep files |
| `cleanup_session` | Close the connection and delete Parquet files |
| `list_parquet_files` | List tables with row counts and schemas |
| `query_heap` | Run DuckDB SQL against Parquet (paginated) |
| `analyze_heap` | Automated waste detection and heap profiling |
## Typical Workflow
1. `convert_heap_dump(hprof_path="/path/to/dump.hprof")` — convert and open a session
2. `list_parquet_files()` — discover available tables
3. `analyze_heap()` — automated waste detection (duplicate strings, bad collections, boxed primitives, etc.)
4. `query_heap(sql="SELECT ...")` — ad-hoc DuckDB queries
5. `close_session(id)` or `cleanup_session(id, confirm=True)`
## Waste Analysis

The `analyze_heap` tool detects common JVM memory-waste patterns across three tiers:
- Tier 1 (fast): Duplicate strings, empty/single-element collections, bad arrays, boxed primitives
- Tier 2 (default): + collection sizing, duplicate byte arrays, class count, GC roots, DirectByteBuffers, thread stacks
- Tier 3 (thorough): + duplicate object arrays, estimated shallow sizes
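To illustrate the Tier 1 duplicate-string idea, here is a small sketch of how wasted bytes from duplicates can be estimated. This is not the tool's actual algorithm; the per-`String` overhead constant is an assumption and varies by JVM version and settings:

```python
from collections import Counter

def duplicate_string_waste(strings, overhead=40):
    """Estimate bytes wasted by duplicate strings.

    For each value appearing n > 1 times, the n - 1 extra copies each
    cost roughly overhead + 2 * len(s) bytes (UTF-16 chars; the 40-byte
    per-String overhead is an assumed, JVM-dependent figure).
    """
    waste = 0
    for s, n in Counter(strings).items():
        if n > 1:
            waste += (n - 1) * (overhead + 2 * len(s))
    return waste

print(duplicate_string_waste(["a", "a", "b"]))  # 42: one extra copy of "a"
```

Deduplicating (e.g. via `String.intern()` or G1 string deduplication) reclaims roughly this estimate.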
## License

MIT