# PytorchConcurrentDataloader

A minimal version of the ConcurrentDataloader repository (https://github.com/iarai/concurrent-dataloader), published to pip: a drop-in replacement for the PyTorch `DataLoader` with concurrency improvements for data fetching.
## Setup

```shell
pip install pytorch-concurrent-dataloader
```
## Usage

- Replace `torch.utils.data.DataLoader` with `pytorch_concurrent_dataloader.DataLoader`.
- Pass the new parameters for concurrent data loading:
```python
from pytorch_concurrent_dataloader import DataLoader

dataloader = DataLoader(
    # pass old parameters as usual
    dataset=...,
    batch_size=...,
    num_workers=...,
    # pass new parameters
    num_fetch_workers=...,
)
```
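To illustrate what concurrent fetching buys, here is a standalone sketch of the idea behind `num_fetch_workers` — it is not the library's implementation. `SlowDataset` and `fetch_batch` are hypothetical names; the sketch assumes only that each sample fetch is I/O-bound, so overlapping fetches with a thread pool hides their latency.

```python
from concurrent.futures import ThreadPoolExecutor
import time

# Hypothetical stand-in for an I/O-bound dataset: each __getitem__
# simulates a slow fetch (e.g. reading a sample from remote storage).
class SlowDataset:
    def __init__(self, n):
        self.n = n

    def __len__(self):
        return self.n

    def __getitem__(self, idx):
        time.sleep(0.05)  # simulated I/O latency per sample
        return idx * 2

def fetch_batch(dataset, indices, num_fetch_workers=4):
    # Fetch all samples of a batch concurrently instead of one by one;
    # this mirrors the idea behind the num_fetch_workers parameter.
    with ThreadPoolExecutor(max_workers=num_fetch_workers) as pool:
        return list(pool.map(dataset.__getitem__, indices))

dataset = SlowDataset(8)
batch = fetch_batch(dataset, range(8), num_fetch_workers=8)
print(batch)  # → [0, 2, 4, 6, 8, 10, 12, 14]
```

With 8 fetch workers the eight sleeps overlap, so the batch takes roughly one sample's latency instead of eight; the same effect applies when the latency comes from real storage or network reads.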
## Download files
### Source Distribution
Hashes for `pytorch_concurrent_dataloader-0.0.5.tar.gz`:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 891d27e10fc1d332cea83745cd81b87ee12835c0ae50d59e2ec39da12eb046dd |
| MD5 | fed93f9511ce43fd3b64ba88ce2fde5a |
| BLAKE2b-256 | 90dd1b5775d661e1d849ffb7c5161b9894ca76cbe61034fc17e85b22b05304b2 |
### Built Distribution
Hashes for `pytorch_concurrent_dataloader-0.0.5-py3-none-any.whl`:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 78a92e7998b327f014a4dff77c671f511ea9e90e749a178980d67fcabcbd8f9e |
| MD5 | 9f8bf8bc53648d102fcbbe09ca08c556 |
| BLAKE2b-256 | 8b73bc515b89318cee5ce7d1b188f6b7fc8c9b8dabee7b459917abda90c8a822 |