Unified interface for IOWarp: high-performance I/O runtime and AI agent toolkit.

# IOWarp

## Context Management Platform

Enabling AI agents to autonomously orchestrate large-scale data and complex multi-step workflows.
## Overview

IOWarp is a context management platform designed to accelerate scientific workflows by using AI to resolve data bottlenecks. It enables AI agents to autonomously orchestrate large-scale data and complex multi-step workflows in high-performance computing environments.
This repository provides unified installation methods and tools for the entire IOWarp ecosystem. It simplifies the deployment of IOWarp's platform components - including the Content Assimilation Engine (CAE), Content Transfer Engine (CTE), Runtime, Agent Toolkit, and MCP servers - across multiple platforms and package managers.
## Key Capabilities
- Context Engineering: 15 specialized MCP servers for scientific computing workflows, ClaudIO agent framework, and intelligent context orchestration
- High Performance: Demonstrated 7.5x speedup in real-world workflows with HPC integration and efficient resource management
- Open Source: BSD 3-Clause licensed, $5M NSF funded, with active community support
- Three-Tier Architecture: Intelligence Layer (AI agents), Tool Layer (data processing), and Storage Layer (hierarchical storage management)
## Installation

### 📦 PyPI (Recommended for Python Users)

Install IOWarp and all of its components via pip or uv:
```bash
# Using pip
pip install iowarp

# Using uv (faster)
uv pip install iowarp

# As a CLI tool with uvx (no installation needed)
uvx iowarp

# Install as a persistent tool
uv tool install iowarp
```
This installs:
- `iowarp-core`: high-performance I/O runtime and data processing engine (with automatic fallback to GitHub releases)
- `iowarp-agent-toolkit`: AI agent tools and 15+ MCP servers for scientific computing
- Unified CLI: a single `iowarp` command to access all functionality
**Automatic Installation Fallback:**

The iowarp package includes intelligent automatic installation for iowarp-core:

- First attempt: checks whether iowarp-core is already installed (e.g., from PyPI wheels, when available)
- Automatic fallback: if not found, automatically downloads and installs a compatible wheel from GitHub releases
- Seamless experience: works transparently on the first `import iowarp`; no manual intervention needed
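The decision the fallback installer makes can be sketched in a few lines. This is an illustrative Python helper, not the actual implementation inside the iowarp package; the function name and return values are hypothetical:

```python
import importlib.util

def resolve_core_install(module: str = "iowarp_core") -> str:
    """Hypothetical helper showing the fallback decision only."""
    if importlib.util.find_spec(module) is not None:
        # First attempt: the module is already importable (e.g. a PyPI wheel)
        return "already-installed"
    # Fallback: a real installer would now download a compatible
    # wheel from GitHub releases and install it with pip
    return "install-from-github-releases"
```

In the real package this check runs transparently, so users never call such a helper themselves.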
This ensures:
- ✅ Installation always succeeds with `pip install iowarp`
- ✅ Works on all supported platforms (Linux x86_64, ARM64), even when PyPI wheels are unavailable
- ✅ Automatic updates when PyPI wheels become available (future)
- ✅ No build tools required for most users
**Platform Compatibility:**
- Linux x86_64 (manylinux_2_17)
- Linux ARM64/aarch64 (manylinux_2_17)
- Python 3.10, 3.11, 3.12, 3.13
If automatic installation fails, you can manually install iowarp-core from GitHub releases:
```bash
# Find your Python version
python --version   # e.g., Python 3.10.x

# Install the matching wheel from GitHub releases.
# Replace cp310 with your Python version tag (cp310, cp311, cp312, cp313)
# and x86_64 with aarch64 on ARM systems.
pip install https://github.com/iowarp/core/releases/download/v0.6.2/iowarp_core-0.6.2-cp310-cp310-manylinux_2_17_x86_64.whl
```
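Rather than editing the URL by hand, the wheel filename pattern above can be derived from the running interpreter. A small illustrative Python helper (the function name is hypothetical; the URL pattern and version are taken from the example above):

```python
import platform
import sys

def core_wheel_url(version: str = "0.6.2") -> str:
    """Build the GitHub-releases wheel URL for the running interpreter,
    following the filename pattern shown above (illustrative helper)."""
    tag = f"cp{sys.version_info.major}{sys.version_info.minor}"  # e.g. cp310
    arch = platform.machine()  # x86_64 or aarch64 on supported Linux
    return (
        "https://github.com/iowarp/core/releases/download/"
        f"v{version}/iowarp_core-{version}-{tag}-{tag}-manylinux_2_17_{arch}.whl"
    )
```

The result can then be passed directly to `pip install`.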
**Quick Start:**

```bash
# Start the IOWarp runtime (default behavior)
iowarp

# Or explicitly
iowarp core start

# Stop the runtime
iowarp core stop

# List available MCP servers
iowarp agent mcp-servers

# Run an MCP server
iowarp agent mcp-server hdf5

# List prompt templates
iowarp agent prompts
```
**Note:** iowarp-core is currently under active development. Some features may not be fully functional yet.
All individual commands remain available:

- `iowarp-core`, `wrp_start`, `wrp_stop`, etc. (core runtime commands)
- `iowarp-agent-toolkit` (agent toolkit launcher)
### ⚡ Native Install

Another way to install IOWarp is to use our standalone installer script:
```bash
# Basic install (uses pip and expects an active virtualenv)
curl -fsSL https://raw.githubusercontent.com/iowarp/iowarp-install/main/install.sh | bash

# Custom install location
curl -fsSL https://raw.githubusercontent.com/iowarp/iowarp-install/main/install.sh | INSTALL_PREFIX=$HOME/iowarp bash

# Install with build options
curl -fsSL https://raw.githubusercontent.com/iowarp/iowarp-install/main/install.sh | WRP_CORE_ENABLE_MPI=ON WRP_CORE_ENABLE_TESTS=ON WRP_CORE_ENABLE_BENCHMARKS=ON bash
```
**Environment Variables:**

- `INSTALL_PREFIX`: installation directory (default: `/usr/local`)
- `WRP_CORE_ENABLE_MPI`: enable MPI support (default: unset; set to `ON` to enable)
- `WRP_CORE_ENABLE_TESTS`: build the test suite (default: unset; set to `ON` to enable)
- `WRP_CORE_ENABLE_BENCHMARKS`: build benchmarks (default: unset; set to `ON` to enable)
This will:
- Clone and build IOWarp core with all submodules
- Install the IOWarp agent toolkit
- Set up the complete IOWarp environment
### 🐳 Docker

Docker provides a containerized alternative. The iowarp/iowarp:latest image includes the complete runtime with buffering services.

1. Pull the Docker image:

```bash
docker pull iowarp/iowarp:latest
```

2. Download the `docker/user/docker-compose.yml` file:

```bash
wget https://raw.githubusercontent.com/iowarp/iowarp/main/docker/user/docker-compose.yml
```

3. Run the container:

```bash
docker-compose up -d
```

> [!NOTE]
> The provided `docker-compose.yml` file already configures the required shared memory (`shm_size: 8g`) and shareable IPC namespace (`ipc: shareable`) settings. These are required for IOWarp to function properly.
#### Configuration (optional)

The default configuration provides up to 16 GB of buffer cache. For more advanced setups, create a `wrp_conf.yaml` configuration file. The following example shows some of the available parameters, but not all:
```yaml
# IOWarp Runtime Configuration File
compose:
  # Context Transfer Engine (CTE) - handles data buffering and I/O
  - mod_name: wrp_cte_core
    pool_name: wrp_cte
    pool_query: local
    pool_id: 512.0
    # Storage block device configuration
    # This is the most important section - defines where data is buffered
    storage:
      # RAM-based storage tier (fastest)
      - path: "ram::cte_ram_tier1"
        bdev_type: "ram"
        capacity_limit: "16GB"
        score: 0.0  # Manual score override (range 0 to 1), put all data here
      # Example: Add NVMe tier (uncomment to use)
      # - path: "/dev/nvme0n1"
      #   bdev_type: "file"
      #   capacity_limit: "500GB"
      #   score: 0.5
      # Example: Add SSD tier (uncomment to use)
      # - path: "/dev/sda1"
      #   bdev_type: "file"
      #   capacity_limit: "1TB"
      #   score: 1.0
  # Context Assimilation Engine (CAE) - handles data processing and transformation
  - mod_name: wrp_cae_core
    pool_name: cae_main
    pool_query: local
    pool_id: "400.0"
```
**Storage Configuration:**

- `path`: device path or RAM identifier (format: `ram::<name>` for RAM, `/dev/<device>` for block devices)
- `bdev_type`: backend type: `"ram"` (memory), `"nvme"` (NVMe SSD), `"aio"` (async I/O for other block devices)
- `capacity_limit`: maximum storage capacity (supports `KB`, `MB`, `GB`, `TB` suffixes)
- `score`: tier priority (0.0 = lowest, 1.0 = highest). A score of 0.0 means any data may be placed in the tier, while 1.0 means only high-priority data is placed there.
Multiple storage tiers can be configured to create a hierarchical storage system. Data is automatically placed across tiers based on the data placement engine (DPE) strategy.
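As an illustration of the capacity suffixes and score semantics described above, here is a small Python sketch. The helper names are hypothetical and this is not IOWarp's actual data placement engine, only a model of score-based tier selection under the stated semantics:

```python
UNITS = {"KB": 1024, "MB": 1024**2, "GB": 1024**3, "TB": 1024**4}

def parse_capacity(limit: str) -> int:
    """Parse a capacity_limit string such as "16GB" into bytes."""
    for suffix, factor in UNITS.items():
        if limit.upper().endswith(suffix):
            return int(float(limit[:-len(suffix)])) * factor
    return int(limit)  # no suffix: plain byte count

def pick_tier(tiers, priority):
    """Pick the highest-scoring tier whose score does not exceed the
    data's priority (score 0.0 accepts anything; 1.0 is high-priority only)."""
    eligible = [t for t in tiers if t["score"] <= priority]
    return max(eligible, key=lambda t: t["score"]) if eligible else None
```

For example, with a RAM tier at score 0.0 and an NVMe tier at score 0.5, data with priority 0.3 lands in RAM while priority 0.9 data reaches the NVMe tier.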
#### Example: Running Benchmarks

The demos/benchmark/ directory contains a complete Docker Compose setup for running CTE benchmarks:

```bash
cd demos/benchmark

# Run the default benchmark (Put test)
docker-compose up

# Run a specific test with custom parameters
TEST_CASE=Get IO_SIZE=4m IO_COUNT=1000 docker-compose up
```
Available benchmark parameters:
- `TEST_CASE`: benchmark test: `Put`, `Get`, `PutGet` (default: `Put`)
- `NUM_PROCS`: number of parallel processes (default: `1`)
- `DEPTH`: queue depth for concurrent operations (default: `4`)
- `IO_SIZE`: size of each I/O operation, with suffix `b`, `k`, `m`, or `g` (default: `1m`)
- `IO_COUNT`: number of operations to perform (default: `100`)
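Together, IO_SIZE and IO_COUNT determine the total volume a benchmark run moves. A small illustrative Python helper (not part of IOWarp) makes the arithmetic explicit:

```python
FACTORS = {"b": 1, "k": 1024, "m": 1024**2, "g": 1024**3}

def total_io_bytes(io_size: str, io_count: int) -> int:
    """Total bytes moved per process: IO_SIZE, parsed using its
    b/k/m/g suffix, multiplied by IO_COUNT (illustrative helper)."""
    return int(io_size[:-1]) * FACTORS[io_size[-1].lower()] * io_count
```

For instance, `IO_SIZE=4m IO_COUNT=1000` moves roughly 4 GB per process.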
The benchmark compose file demonstrates:

- Separate runtime and benchmark services
- Shared memory configuration (`shm_size: 8g`)
- IPC namespace sharing for shared-memory access
- Custom CTE configuration via volume mounts
- Health checks to ensure runtime readiness
### 📦 Spack

1. Install the Spack package manager (see the Spack installation guide)
2. Add the IOWarp repository:

```bash
spack repo add iowarp-spack
```

3. Install IOWarp:

```bash
spack install iowarp
```
## Resources
- Gnosis Research Center: grc.iit.edu
- Website: iowarp.ai
- Platform Documentation: iowarp.ai/platform
- Docs: iowarp.ai/docs/intro
- Contributing: See CONTRIBUTING for guidelines
- License: BSD 3-Clause License
- Support: GitHub Issues | Project Homepage
## Download files
### File details: iowarp-0.1.1.tar.gz

File metadata:

- Download URL: iowarp-0.1.1.tar.gz
- Upload date:
- Size: 39.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 9716d3ff57766deac3d51249b2d8f08903f795d8f7ce3f75bccf19c700673323 |
| MD5 | 8434e910847f8534a6cb803b21244fcc |
| BLAKE2b-256 | 92d6d8f019d04fe25217a8ed9b64977b13972e791c055863a7eb1a31a482e5c0 |
Provenance:

The following attestation bundle was made for iowarp-0.1.1.tar.gz:

Publisher: publish-pypi.yml on iowarp/iowarp

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: iowarp-0.1.1.tar.gz
- Subject digest: 9716d3ff57766deac3d51249b2d8f08903f795d8f7ce3f75bccf19c700673323
- Sigstore transparency entry: 704355189
- Sigstore integration time:
- Permalink: iowarp/iowarp@41ba4466bd302e994ebc10b1ebe13eaf783ccf07
- Branch / Tag: refs/tags/v0.1.1
- Owner: https://github.com/iowarp
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-pypi.yml@41ba4466bd302e994ebc10b1ebe13eaf783ccf07
- Trigger Event: push
### File details: iowarp-0.1.1-py3-none-any.whl

File metadata:

- Download URL: iowarp-0.1.1-py3-none-any.whl
- Upload date:
- Size: 12.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 05c052954792db0844c5872c5307c3a00a67d128877a7956455580c9293fae34 |
| MD5 | 521e12fe2ed59c12d40cd8c9891be150 |
| BLAKE2b-256 | 263b34bd8f6d5bcc616bc46f300068b7dbeb1705f4b7cffd98e9a0677def0b4e |
Provenance:

The following attestation bundle was made for iowarp-0.1.1-py3-none-any.whl:

Publisher: publish-pypi.yml on iowarp/iowarp

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: iowarp-0.1.1-py3-none-any.whl
- Subject digest: 05c052954792db0844c5872c5307c3a00a67d128877a7956455580c9293fae34
- Sigstore transparency entry: 704355200
- Sigstore integration time:
- Permalink: iowarp/iowarp@41ba4466bd302e994ebc10b1ebe13eaf783ccf07
- Branch / Tag: refs/tags/v0.1.1
- Owner: https://github.com/iowarp
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-pypi.yml@41ba4466bd302e994ebc10b1ebe13eaf783ccf07
- Trigger Event: push