torchcache
Cache PyTorch module outputs on the fly.
Effortlessly cache PyTorch module outputs or PyTorch-heavy functions on-the-fly with torchcache.
Particularly useful for caching and serving the outputs of large, computationally expensive pre-trained PyTorch modules, such as vision transformers. Note that gradients will not flow through the cached outputs.
Features
- Cache PyTorch module outputs or pure Python functions either in-memory or persistently to disk.
- Simple decorator-based interface for easy usage.
- Uses an MRU (most-recently-used) cache, which evicts the most recently used items first to manage memory/disk usage. MRU eviction suits epoch-based training: a sample that was just served will not be needed again until the next pass over the dataset.
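The MRU policy above can be sketched with a minimal stdlib-only cache. This is a simplified illustration of the eviction rule, not torchcache's actual implementation:

```python
from collections import OrderedDict


class MRUCache:
    """Minimal MRU cache: when full, evict the MOST recently used entry."""

    def __init__(self, maxsize):
        self.maxsize = maxsize
        self._data = OrderedDict()  # most recently used entries sit at the end

    def get(self, key):
        value = self._data[key]
        self._data.move_to_end(key)  # mark as most recently used
        return value

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        elif len(self._data) >= self.maxsize:
            # Unlike LRU, pop from the most-recently-used end.
            self._data.popitem(last=True)
        self._data[key] = value


cache = MRUCache(maxsize=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # "a" becomes the most recently used entry
cache.put("c", 3)  # evicts "a" (the MRU entry); "b" survives
```

Under a sequential scan such as a training epoch, the entry just touched is the one that will not recur until the scan wraps around, which is why MRU eviction wastes less cache space here than LRU would.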
Installation
pip install torchcache
Citation
If you use our work, please consider citing our paper:
@inproceedings{akbiyik2023routeformer,
  title={Leveraging Driver Field-of-View for Multimodal Ego-Trajectory Prediction},
  author={M. Eren Akbiyik and Nedko Savov and Danda Pani Paudel and Nikola Popovic and Christian Vater and Otmar Hilliges and Luc Van Gool and Xi Wang},
  booktitle={International Conference on Learning Representations},
  year={2025}
}
Basic usage
Quickly cache the output of your PyTorch module with a single decorator:
import torch
import torch.nn as nn

from torchcache import torchcache

@torchcache()
class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(10, 10)

    def forward(self, x):
        # This output will be cached
        return self.linear(x)

model = MyModule()
input_tensor = torch.ones(1, 10, dtype=torch.float32)

# Output is cached during the first call...
output = model(input_tensor)

# ...and is retrieved from the cache for the next one
output_cached = model(input_tensor)
You can also cache the output of any function, not just PyTorch modules:
from torchcache import torchcache
@torchcache()
def my_function(x):
# This output will be cached
return x * 2
See documentation at torchcache.readthedocs.io for more examples.
Assumptions
To ensure seamless operation, torchcache assumes the following:
- Your module is a subclass of nn.Module.
- The module's forward method accepts any number of positional or keyword arguments, each either a tensor of shape (B, *), where B is the batch size and * represents any number of dimensions, or a basic immutable Python type (int, str, float, bool). All tensors should be on the same device and have the same dtype.
- The forward method returns a single tensor of shape (B, *).
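Caching per input implies deriving a deterministic key from the batched tensors and the basic immutable arguments. The stdlib-only sketch below shows one way such a key could be built; it is illustrative only, and torchcache's actual hashing scheme may differ. Plain bytes objects stand in for tensor buffers (a real tensor would first be reduced to raw bytes):

```python
import hashlib


def cache_key(*args, **kwargs):
    """Derive a deterministic hex key from byte buffers and immutable values."""
    h = hashlib.blake2b(digest_size=16)
    # Sort kwargs so the key does not depend on keyword order.
    for item in list(args) + sorted(kwargs.items()):
        h.update(item if isinstance(item, bytes) else repr(item).encode())
    return h.hexdigest()


key1 = cache_key(b"\x00\x01\x02", scale=1.5)
key2 = cache_key(b"\x00\x01\x02", scale=1.5)
key3 = cache_key(b"\x00\x01\x03", scale=1.5)
assert key1 == key2  # identical inputs -> identical key
assert key1 != key3  # different tensor bytes -> different key
```

Because the key covers every argument, changing any input tensor or immutable parameter produces a cache miss rather than serving a stale output.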
Contribution
- Ensure you have Python installed.
- Install poetry.
- Run poetry install to set up dependencies.
- Run poetry run pre-commit install to install pre-commit hooks.
- Create a branch, make your changes, and open a pull request.
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file torchcache-0.6.0.tar.gz.
File metadata
- Download URL: torchcache-0.6.0.tar.gz
- Upload date:
- Size: 13.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.2 CPython/3.9.22 Linux/6.8.0-1021-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 845f5632e552669b4b93de4491091dc238a2063e6902f83398101c1e3bf0d26f |
| MD5 | 1c107eab9515531198eba03e1b10020e |
| BLAKE2b-256 | 15ff50180f21b72ec80ec113870f91adc5dcfe5578da88f2ece8839bb262b4e0 |
File details
Details for the file torchcache-0.6.0-py3-none-any.whl.
File metadata
- Download URL: torchcache-0.6.0-py3-none-any.whl
- Upload date:
- Size: 12.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.2 CPython/3.9.22 Linux/6.8.0-1021-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7754b5bec3c328b5c54db17704fb1cb887b5f132ead9905c25fc47f666c4d63c |
| MD5 | 03638999fd7a6c304dc1492736ad5ef5 |
| BLAKE2b-256 | 78223bbe4e0e7c2d7523476d51e387e5cb0fab59e0581bdf9a295601d6745e78 |