TPU utility with PyTorch XLA
Project description
# TPU_utility

Utility for TPU with PyTorch XLA.
For more examples, please check [this Kaggle notebook](https://www.kaggle.com/wabinab/test-tpu-training).
We have moved the packages into the `tpu_util` folder. To install on Colab, run `!python -m pip install git+https://github.com/Wabinab/TPU_utility.git@main`.
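For example, a minimal Colab cell might look like the following sketch. The `torch_xla` calls are standard PyTorch XLA and are shown only to verify that the TPU is reachable; the internal API of `tpu_util` itself is not shown here.

```python
# Install the package from GitHub (run in a Colab cell).
!python -m pip install git+https://github.com/Wabinab/TPU_utility.git@main

# Sanity check: acquire the TPU device through standard PyTorch XLA.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()          # the TPU device, e.g. xla:0 or xla:1
x = torch.randn(2, 2).to(device)  # a small tensor moved onto the TPU
print(device, x.shape)
```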
Currently, if you are running a TPU on Google Cloud Platform, the dependencies are not installed automatically. We do not have access to GCP and are not sure how it works there, so you are responsible for installing requirements.txt and all other requirements yourself, or for cloning the Colab/Kaggle environment to your machine and setting it up from there.
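As a rough, untested sketch of a manual setup on a GCP TPU VM (the location of `requirements.txt` at the repository root is an assumption; adjust paths to your environment):

```python
# Run in a notebook cell; the '!' prefix assumes an IPython/Jupyter environment.
# Drop the '!' to run the same commands in a plain shell.
!git clone https://github.com/Wabinab/TPU_utility.git
!python -m pip install -r TPU_utility/requirements.txt   # assumed to sit at the repo root
!python -m pip install git+https://github.com/Wabinab/TPU_utility.git@main
```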
The sample contains data taken from [this](https://www.kaggle.com/wabinab/gnet-cqt-shannon-alt) dataset. The image is a spectrogram, so it is not easy to interpret; however, it is small enough to be used as a test case.
Download files
Source Distribution
File details
Details for the file tpu_util-0.1.0.tar.gz.
File metadata
- Download URL: tpu_util-0.1.0.tar.gz
- Upload date:
- Size: 18.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.2 importlib_metadata/4.6.3 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.0 CPython/3.8.11
File hashes
Algorithm | Hash digest
---|---
SHA256 | 109dbe21294e9243caf3a6774e85df6ce36a3198a13807c34db7ce32e8363f52
MD5 | 9584408bb851097d026b4dbdfd5e033d
BLAKE2b-256 | 89883a0a4d8629a94143f2dff497b03d87d3be1bdb3ab3ebe84b1d1a13bb18ea