Utilities for GPU activation and data transfer in PyTorch.
get_gpu
get_gpu is a Python package that simplifies GPU management in PyTorch projects. It helps users activate a GPU (if available) and move data to GPU memory for faster training and inference.
Features
- Automatically detect and use a GPU if available, falling back to the CPU otherwise.
- Transfer tensors or datasets to the selected device.
- Wrap PyTorch DataLoader to seamlessly move data to the GPU during iteration.
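The features above follow a common PyTorch pattern. The sketch below is illustrative only, not the package's actual API: `get_device`, `to_device`, and `DeviceDataLoader` are hypothetical names showing the detect-and-fallback and loader-wrapping techniques the list describes.

```python
def get_device():
    # Return "cuda" when PyTorch reports a usable GPU, otherwise "cpu".
    # The import is guarded so the helper degrades gracefully when
    # PyTorch is not installed.
    try:
        import torch
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    return "cpu"


def to_device(data, device):
    # Recursively move a batch to the device: handles nested lists and
    # tuples, plus anything exposing a tensor-like .to(device) method.
    if isinstance(data, (list, tuple)):
        return type(data)(to_device(x, device) for x in data)
    return data.to(device)


class DeviceDataLoader:
    """Wrap a DataLoader so every batch it yields is already on `device`."""

    def __init__(self, dataloader, device):
        self.dataloader = dataloader
        self.device = device

    def __iter__(self):
        for batch in self.dataloader:
            yield to_device(batch, self.device)

    def __len__(self):
        return len(self.dataloader)
```

Wrapping a loader this way (`loader = DeviceDataLoader(loader, get_device())`) keeps the training loop free of repetitive `.to(device)` calls.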
Installation
Install the package directly from PyPI:
pip install get-gpu
Download files
Download the file for your platform.
Source Distribution
gpu-active-0.1.1.tar.gz (2.9 kB)
Built Distribution
gpu_active-0.1.1-py3-none-any.whl (3.4 kB)
File details
Details for the file gpu-active-0.1.1.tar.gz.
File metadata
- Download URL: gpu-active-0.1.1.tar.gz
- Upload date:
- Size: 2.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 35a8231803bc87bc5c27922f0710c99b08aa18f2a82b28b07cd1fc58a9441dd7 |
| MD5 | 6fd0ba990efb208e5cf421a016947f7b |
| BLAKE2b-256 | 00a46a01b0711d5bb91b5fe01e7d3094280d23a946fe31f06480ee37421ff8c8 |
File details
Details for the file gpu_active-0.1.1-py3-none-any.whl.
File metadata
- Download URL: gpu_active-0.1.1-py3-none-any.whl
- Upload date:
- Size: 3.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1a2b21a23f8e3ede33994fe72505f19832d86ece3dc1bfb110bd55827adb7b29 |
| MD5 | f3f0f2036fdb40ee34ad510ad09531d2 |
| BLAKE2b-256 | 8802b9a70903beb59af2fcc08f12fb83744958561c3d8e1e05b1b6ae865d46a5 |