arraybridge
Unified API for NumPy, CuPy, PyTorch, TensorFlow, JAX, and pyclesperanto with automatic memory type conversion
Features
- Unified API: Single interface for 6 array/tensor frameworks
- Automatic Conversion: DLPack + NumPy fallback with automatic path selection
- Declarative Decorators: `@numpy`, `@torch`, `@cupy` for memory type declarations
- Device Management: Thread-local GPU contexts and automatic stream management
- OOM Recovery: Automatic out-of-memory detection and cache clearing
- Dtype Preservation: Automatic dtype preservation across conversions
- Zero Dependencies: Only requires NumPy (framework dependencies are optional)
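The "DLPack + NumPy fallback with automatic path selection" idea can be pictured with a short sketch. This is a hypothetical helper, not arraybridge's actual internals: prefer the zero-copy DLPack route when both sides support the protocol, otherwise round-trip through host memory.

```python
import numpy as np

def convert_with_fallback(arr, target):
    """Hypothetical sketch of automatic path selection:
    use DLPack when both sides support it, else copy via NumPy."""
    if hasattr(arr, "__dlpack__") and hasattr(target, "from_dlpack"):
        return target.from_dlpack(arr)        # zero-copy when supported
    return target.asarray(np.asarray(arr))    # fallback: copy through host memory

# NumPy implements both sides of the protocol, so this takes the DLPack path
src = np.arange(3)
out = convert_with_fallback(src, np)
```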
Quick Start
```python
from arraybridge import convert_memory, detect_memory_type
import numpy as np

# Create NumPy array
data = np.array([[1, 2], [3, 4]])

# Convert to PyTorch (if installed)
torch_data = convert_memory(data, source_type='numpy', target_type='torch', gpu_id=0)

# Detect memory type
mem_type = detect_memory_type(torch_data)  # 'torch'
```
Declarative Decorators
```python
import numpy as np
from arraybridge import numpy, torch, cupy

@torch(input_type='numpy', output_type='torch', oom_recovery=True)
def my_gpu_function(data):
    """Automatically converts input from NumPy to PyTorch."""
    return data * 2

# Use with NumPy input
result = my_gpu_function(np.array([1, 2, 3]))  # Returns a PyTorch tensor
```
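Mechanically, a decorator of this shape only needs to coerce arguments to the declared type before the call (and optionally convert the result after it). A minimal NumPy-only sketch of the pattern, using a hypothetical `as_numpy` rather than arraybridge's implementation:

```python
import functools
import numpy as np

def as_numpy(func):
    """Hypothetical sketch of a memory-type decorator:
    coerce the input to the declared type before calling."""
    @functools.wraps(func)
    def wrapper(data, *args, **kwargs):
        return func(np.asarray(data), *args, **kwargs)
    return wrapper

@as_numpy
def double(data):
    return data * 2

result = double([1, 2, 3])  # plain list in, ndarray out
```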
Installation
```bash
# Base installation (NumPy only)
pip install arraybridge

# With specific frameworks
pip install arraybridge[torch]
pip install arraybridge[cupy]
pip install arraybridge[tensorflow]
pip install arraybridge[jax]
pip install arraybridge[pyclesperanto]

# With all frameworks
pip install arraybridge[all]
```
Supported Frameworks
| Framework | CPU | GPU | DLPack | Notes |
|---|---|---|---|---|
| NumPy | ✅ | ❌ | ❌ | Base framework |
| CuPy | ❌ | ✅ | ✅ | CUDA arrays |
| PyTorch | ✅ | ✅ | ✅ | Tensors |
| TensorFlow | ✅ | ✅ | ✅ | Tensors |
| JAX | ✅ | ✅ | ✅ | Arrays |
| pyclesperanto | ❌ | ✅ | ❌ | OpenCL arrays |
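Since framework dependencies are optional, it can be handy to check at runtime which backends are importable before requesting a conversion. One stdlib-only way to do this (this is not an arraybridge API):

```python
import importlib.util

# Top-level module names of the six supported frameworks
FRAMEWORKS = ["numpy", "cupy", "torch", "tensorflow", "jax", "pyclesperanto"]

# find_spec returns None for packages that are not installed
available = [name for name in FRAMEWORKS
             if importlib.util.find_spec(name) is not None]
```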
Why arraybridge?
Before (Manual conversion hell):
```python
import numpy as np
import torch
import cupy as cp

def process_data(data, target='torch'):
    if target == 'torch':
        if isinstance(data, np.ndarray):
            return torch.from_numpy(data).cuda()
        elif isinstance(data, cp.ndarray):
            return torch.as_tensor(data, device='cuda')
    elif target == 'cupy':
        if isinstance(data, np.ndarray):
            return cp.asarray(data)
        elif hasattr(data, '__cuda_array_interface__'):
            return cp.asarray(data)
    # ... 30 more lines of if/elif ...
```
After (arraybridge):
```python
from arraybridge import convert_memory, detect_memory_type

def process_data(data, target='torch'):
    source = detect_memory_type(data)
    return convert_memory(data, source_type=source, target_type=target, gpu_id=0)
```
Advanced Features
Thread-Local GPU Streams
```python
from arraybridge import torch

@torch(oom_recovery=True)
def parallel_processing(data):
    # Automatically uses a thread-local CUDA stream,
    # enabling true parallelization across threads
    return data * 2
```
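The thread-local part of this mechanism can be illustrated with the standard library's `threading.local`: each worker thread lazily creates and then reuses its own stream object. This is a sketch with a placeholder factory in place of a real CUDA stream constructor:

```python
import threading

_local = threading.local()

def get_thread_stream(make_stream):
    """Return this thread's stream, creating it on first use."""
    if not hasattr(_local, "stream"):
        _local.stream = make_stream()
    return _local.stream

streams = {}

def worker(name):
    # object() stands in for a real CUDA stream
    streams[name] = get_thread_stream(object)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Because `_local` is per-thread, the two workers end up with distinct stream objects even though they call the same function.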
OOM Recovery
```python
from arraybridge import cupy

@cupy(oom_recovery=True)
def memory_intensive_operation(data):
    # Automatically catches OOM errors,
    # clears the GPU cache, and retries
    return data @ data.T
```
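The retry logic behind `oom_recovery` can be sketched framework-agnostically: catch the allocator's out-of-memory error, release cached blocks, and try once more. This is a hypothetical helper; real code would catch the framework-specific OOM exception and call the matching cache-clearing routine (e.g. `torch.cuda.empty_cache()`):

```python
def with_oom_recovery(func, clear_cache, oom_errors=(MemoryError,)):
    """Run func; on OOM, clear the GPU cache and retry once."""
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except oom_errors:
            clear_cache()  # e.g. torch.cuda.empty_cache()
            return func(*args, **kwargs)
    return wrapper

# Simulate an allocation that fails once, then succeeds after the cache is cleared
state = {"failed": False, "cleared": False}

def flaky_alloc():
    if not state["failed"]:
        state["failed"] = True
        raise MemoryError("out of memory")
    return "ok"

safe_alloc = with_oom_recovery(flaky_alloc, lambda: state.update(cleared=True))
result = safe_alloc()
```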
Stack Utilities
```python
import numpy as np
from arraybridge import stack_slices, unstack_slices

# Stack 2D slices into a 3D array
slices_2d = [np.random.rand(100, 100) for _ in range(50)]
volume_3d = stack_slices(slices_2d, memory_type='torch', gpu_id=0)

# Unstack the 3D array back into 2D slices
slices_back = unstack_slices(volume_3d, memory_type='torch', gpu_id=0)
```
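On the CPU side, stacking and unstacking correspond to `np.stack` and iteration over the leading axis; a NumPy-only sketch of the same shape transformation:

```python
import numpy as np

# Five 2D slices stacked along a new leading axis
slices_2d = [np.random.rand(8, 8) for _ in range(5)]
volume = np.stack(slices_2d, axis=0)  # shape (5, 8, 8)

# Unstack: index the leading axis to recover the slices
slices_back = [volume[i] for i in range(volume.shape[0])]
```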
Documentation
Full documentation available at arraybridge.readthedocs.io
Performance
arraybridge uses DLPack for zero-copy conversions when possible:
| Conversion | Method | Speed |
|---|---|---|
| NumPy → PyTorch | torch.from_numpy() | Zero-copy |
| PyTorch → CuPy | DLPack | Zero-copy |
| CuPy → JAX | DLPack | Zero-copy |
| NumPy → CuPy | Copy | Fast |
| PyTorch → NumPy | .numpy() | Zero-copy (CPU) |
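Zero-copy here means the importing framework wraps the exporter's buffer instead of copying its bytes. NumPy implements both sides of the DLPack protocol, so the effect can be checked without a GPU:

```python
import numpy as np

a = np.arange(4, dtype=np.float32)
b = np.from_dlpack(a)  # import a's buffer via the DLPack protocol

# Same underlying memory: no bytes were copied
assert np.shares_memory(a, b)
```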
License
MIT License - see LICENSE file for details
Contributing
Contributions welcome! Please see CONTRIBUTING.md for guidelines.
Credits
Developed by Tristan Simas as part of the OpenHCS project.
File details
Details for the file arraybridge-0.2.9.tar.gz.
File metadata
- Download URL: arraybridge-0.2.9.tar.gz
- Upload date:
- Size: 119.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | d88addd95e9ae388e2d83f92b3b32234af76b566ce656948407c412ea419ce0a |
| MD5 | 7435d457892fac6d61a1f66e2c719cfd |
| BLAKE2b-256 | 1770751d2687283fbfdabf105005431d29d2eb2ee570a3c80925dc9fb4dd4679 |
Provenance
The following attestation bundles were made for arraybridge-0.2.9.tar.gz:
Publisher: publish.yml on OpenHCSDev/ArrayBridge

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: arraybridge-0.2.9.tar.gz
- Subject digest: d88addd95e9ae388e2d83f92b3b32234af76b566ce656948407c412ea419ce0a
- Sigstore transparency entry: 884293184
- Sigstore integration time:
- Permalink: OpenHCSDev/ArrayBridge@0e29e6d0d9bc9087ee44dcdb0467765c9a776120
- Branch / Tag: refs/tags/v0.2.9
- Owner: https://github.com/OpenHCSDev
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@0e29e6d0d9bc9087ee44dcdb0467765c9a776120
- Trigger Event: push
File details
Details for the file arraybridge-0.2.9-py3-none-any.whl.
File metadata
- Download URL: arraybridge-0.2.9-py3-none-any.whl
- Upload date:
- Size: 32.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 02f41ac61c0c1bf1e2d86780226b6cc5aac161502ebb81179a8bb1d1033214a3 |
| MD5 | c38e03ee5ca28212f65a3190eda9a01d |
| BLAKE2b-256 | 4f5817884d1fc4d8d8802c5ff0b029f557ecc1fb54ebd907a0d5b450fe7d0780 |
Provenance
The following attestation bundles were made for arraybridge-0.2.9-py3-none-any.whl:
Publisher: publish.yml on OpenHCSDev/ArrayBridge

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: arraybridge-0.2.9-py3-none-any.whl
- Subject digest: 02f41ac61c0c1bf1e2d86780226b6cc5aac161502ebb81179a8bb1d1033214a3
- Sigstore transparency entry: 884293282
- Sigstore integration time:
- Permalink: OpenHCSDev/ArrayBridge@0e29e6d0d9bc9087ee44dcdb0467765c9a776120
- Branch / Tag: refs/tags/v0.2.9
- Owner: https://github.com/OpenHCSDev
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@0e29e6d0d9bc9087ee44dcdb0467765c9a776120
- Trigger Event: push