HDF5 tools and semantic metadata for agentic workflows

Agentic HDF5

A set of extensions and tools for Claude Code that let AI agents work with HDF5 data and files at a high level. Provides 10+ tools, 14+ skills, support for semantic metadata, and natural language search over vectorized semantic metadata.

Prerequisites

Claude Code must be installed. The MCP server is fetched and run via uvx, so uv must also be installed — all Python dependencies (h5py, numpy, matplotlib, etc.) are resolved automatically.

Installation

Run these slash commands inside a Claude Code session to register the plugin marketplace and install the plugin:

# Add the marketplace (one-time)
/plugin marketplace add mattjala/agentic-hdf5

# Install the plugin
/plugin install ahdf5-plugin@agentic-hdf5

This installs all 14+ skills and 10+ MCP tools automatically. After installing, restart your Claude Code session for the plugin to take effect.

For development/testing, clone the repo and load the plugin directly:

git clone https://github.com/mattjala/agentic-hdf5.git
claude --plugin-dir ./agentic-hdf5/plugin

Architecture

Agentic HDF5 is composed of two complementary layers — tools and skills — that can be used independently or together.

Tools

Tools are Python functions that agents call to perform concrete operations on HDF5 files. They handle the actual file I/O: reading metadata, rechunking datasets, applying compression filters, writing semantic metadata, generating visualizations, and running semantic searches. Tools live in the tools/ directory and are registered in tools/tool_catalog.json.

  • get_object_metadata: Inspect dataset/group properties (shape, dtype, chunks, compression)
  • rechunk_dataset: Modify chunk layout (larger, smaller, exact dimensions, contiguous)
  • apply_filter_dataset: Apply or remove compression filters (gzip, szip, shuffle, etc.)
  • visualize: Generate plots from datasets (line, heatmap, histogram, contour, etc.)
  • read_semantic_metadata: Read semantic metadata (SMD) from an HDF5 object
  • write_semantic_metadata: Write or update SMD on a single object
  • collect_objects_for_smd: Scan a file for objects missing SMD
  • write_smd_batch: Write SMD to multiple objects in a single transaction
  • vectorize_semantic_metadata: Embed all SMD into vector representations for search
  • query_semantic_metadata: Natural language semantic search over vectorized SMD
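As a rough illustration of the storage properties that get_object_metadata surfaces, the same information can be read directly with h5py. The file path, dataset name, and layout below are invented for the example; the actual tool may report additional fields:

```python
import os
import tempfile

import h5py
import numpy as np

# Build a small example file (stand-in for a real data file).
path = os.path.join(tempfile.mkdtemp(), "example.h5")
with h5py.File(path, "w") as f:
    f.create_dataset(
        "temperature",
        data=np.random.rand(100, 100),
        chunks=(25, 25),
        compression="gzip",
        compression_opts=4,
    )

# Read back the storage properties the tool would report.
with h5py.File(path, "r") as f:
    dset = f["temperature"]
    meta = {
        "shape": dset.shape,
        "dtype": str(dset.dtype),
        "chunks": dset.chunks,
        "compression": dset.compression,
    }

print(meta)
```

The same handful of properties (shape, dtype, chunk layout, filter pipeline) is usually enough for an agent to decide whether a dataset needs rechunking or recompression.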

Skills

Skills are curated knowledge documents that teach the agent how and when to apply HDF5 best practices. They are loaded on-demand when a user's request matches the skill's domain, giving the agent expert-level guidance without bloating its context on every interaction. Skills live in .claude/skills/.

  • hdf5-chunking: Chunk layout strategies and optimization
  • hdf5-filters: Compression and filter selection
  • hdf5-io: General I/O performance tuning
  • hdf5-cloud-optimized: Cloud/S3 access, paged aggregation, ros3 VFD
  • hdf5-core-vfd: In-memory file driver
  • hdf5-parallel: MPI-IO and parallel HDF5
  • hdf5-swmr: Single Writer Multiple Reader access
  • hdf5-vds: Virtual datasets across multiple files
  • hdf5-vol-usage: Using VOL connectors (DAOS, Async, Cache, REST)
  • hdf5-vol-dev: Developing custom VOL connectors
  • hdf5-visualization: Plot type selection and matplotlib guidance
  • hdf5-scientific-publishing: DOIs, Zenodo/Dataverse, FAIR data practices
  • hdf5-omni-selective: OMNI file creation for selective data download
  • hdf5-optimization: General HDF5 optimization scripts

How They Work Together

Tools and skills are designed to complement each other but neither requires the other:

  • Skills alone — An agent can use skill knowledge to advise on HDF5 best practices (e.g., recommending a chunk layout) without modifying any files.
  • Tools alone — An agent can call tools to inspect, optimize, or annotate files using the tool's built-in logic, without loading any skill context.
  • Skills + Tools — The most powerful mode. A skill provides the agent with expert knowledge (e.g., chunking strategies for cloud access patterns), and the agent then uses tools to apply that knowledge to specific files (e.g., rechunking a dataset with the recommended layout).
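The combined mode can be sketched at the h5py level. This is a minimal copy-and-replace rechunk, assuming the agent has already chosen a target layout from skill guidance; the actual rechunk_dataset tool may proceed differently, and the file and dataset names are invented:

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "data.h5")
with h5py.File(path, "w") as f:
    f.create_dataset(
        "signal",
        data=np.arange(10000).reshape(100, 100),
        chunks=(10, 10),  # original, deliberately small chunks
    )

# Rechunk by writing a copy with the recommended layout, then
# swapping it in under the original name.
with h5py.File(path, "a") as f:
    src = f["signal"]
    f.create_dataset(
        "signal_tmp",
        data=src[...],
        chunks=(50, 50),  # layout recommended by the chunking skill
        compression=src.compression,
    )
    del f["signal"]
    f.move("signal_tmp", "signal")

with h5py.File(path, "r") as f:
    new_chunks = f["signal"].chunks
    total = int(f["signal"][...].sum())

print(new_chunks)
```

Copying into a temporary dataset before deleting the original keeps the data safe if the write fails partway through.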

Semantic Metadata (SMD)

Semantic metadata attributes (ahdf5-smd-*) attach human-readable, structured descriptions to HDF5 objects — describing what data represents, its provenance, units, and scientific significance. SMD bridges the gap between raw array data and human understanding.
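A sketch of writing SMD by hand with h5py. The ahdf5-smd-* prefix comes from this document, but the specific attribute names (description, units, provenance) are illustrative guesses; consult the specification below for the real schema:

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "smd.h5")
with h5py.File(path, "w") as f:
    dset = f.create_dataset("temperature", data=np.random.rand(50))
    # Attribute names below are illustrative, not the official schema.
    dset.attrs["ahdf5-smd-description"] = "Surface temperature readings"
    dset.attrs["ahdf5-smd-units"] = "degrees Celsius"
    dset.attrs["ahdf5-smd-provenance"] = "weather-station array, field campaign"

# Collect every SMD attribute on the object by prefix.
with h5py.File(path, "r") as f:
    smd = {
        k: v
        for k, v in f["temperature"].attrs.items()
        if k.startswith("ahdf5-smd-")
    }

print(smd)
```

Because SMD lives in ordinary HDF5 attributes, any HDF5 reader can retrieve it without the plugin installed.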

See docs/semantic-metadata.md for the full specification.

Vectorized Semantic Metadata (VSMD)

VSMD converts text-based SMD into vector embeddings stored directly in the HDF5 file, enabling natural language search over datasets. An agent (or user) can query "temperature measurements in Celsius" and retrieve the most semantically relevant objects — without needing to know paths or attribute names.
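The query flow can be sketched with plain numpy. The toy hash-based embed function stands in for a real embedding model, and the object paths and descriptions are invented; actual VSMD stores the embeddings inside the HDF5 file itself rather than in memory:

```python
import hashlib

import numpy as np

DIM = 64

def embed(text: str) -> np.ndarray:
    # Toy deterministic embedding: hash words into a fixed-size vector.
    # A real VSMD pipeline would use a proper embedding model here.
    vec = np.zeros(DIM)
    for word in text.lower().split():
        idx = int(hashlib.md5(word.encode()).hexdigest(), 16) % DIM
        vec[idx] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# SMD text for a few objects (paths and descriptions are illustrative).
smd = {
    "/weather/temperature": "temperature measurements in Celsius",
    "/weather/pressure": "barometric pressure in hectopascals",
    "/imaging/scan": "MRI scan voxel intensities",
}
paths = list(smd)
matrix = np.stack([embed(text) for text in smd.values()])

# Natural-language query: cosine similarity against every stored vector
# (rows are unit-normalized, so a dot product is cosine similarity).
query = embed("temperature measurements in Celsius")
best = paths[int(np.argmax(matrix @ query))]
print(best)  # → /weather/temperature
```

The ranking step is a single matrix-vector product, which is why embedding once and querying many times is cheap.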

See docs/vectorized-semantic-metadata.md for the design document.

Testing

python -m pytest tests/

Agent Tool Selection Evaluation

The tests/agent_tool_selection/ suite evaluates whether Claude models correctly identify the right HDF5 tool from natural language prompts. Run from a normal terminal (not inside Claude Code):

pytest -m agent tests/agent_tool_selection/ -v --model haiku

Date        Model              Parameters     Score
2026-03-16  Claude Opus 4.6    Not disclosed  7/7 (100%)
2026-03-16  Claude Sonnet 4.6  Not disclosed  7/7 (100%)
2026-03-16  Claude Haiku 4.5   Not disclosed  7/7 (100%)
2026-03-16  Claude 3 Haiku     ~20B (est.)    7/7 (100%)

See tests/agent_tool_selection/RESULTS.md for full methodology and detailed results across prompt modes.
