TPU utility with PyTorch XLA
Project description
# TPU_utility

Utility for TPU with PyTorch XLA.
For more examples, please check [this kaggle notebook](https://www.kaggle.com/wabinab/test-tpu-training).
We have moved the package into the folder tpu_util. For installation on Colab, call `!python -m pip install git+https://github.com/Wabinab/TPU_utility.git@main` for nightly releases.
Update: we have now published the package on PyPI for stable versions. You can install it with `pip install tpu-util`.
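For convenience, both install paths above can be run as notebook cells (the `!` prefix, as in the Colab command above, is assumed):

```
# Nightly release, straight from GitHub (Colab/Kaggle notebook cell):
!python -m pip install git+https://github.com/Wabinab/TPU_utility.git@main

# Stable release from PyPI:
!pip install tpu-util
```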
Currently, if you are running a TPU on Google Cloud Platform, the dependencies are not installed automatically. We do not have access to GCP and are not sure how it works there, so you are responsible for installing requirements.txt and all other requirements yourself, or for cloning the Colab/Kaggle environment to your machine and setting it up from there.
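After installation, a quick way to confirm that the TPU runtime itself is reachable is a small PyTorch XLA check. This is only a sketch of the underlying torch_xla API that the utilities build on, not of tpu_util's own functions (which are not documented here):

```python
# Minimal sanity check that PyTorch XLA can see a TPU core.
# Assumes torch and torch_xla are already installed in the environment.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()            # default TPU device, e.g. xla:1
x = torch.randn(2, 2).to(device)    # move a small tensor onto the TPU
print(x.device)                     # should print an xla device, not cpu
```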
The sample contains data taken from [this](https://www.kaggle.com/wabinab/gnet-cqt-shannon-alt) dataset. The image is a spectrogram, so it is not easy to interpret; however, it is small enough to be used as a test case.
File details
Details for the file `tpu_util-0.1.3.tar.gz`.
File metadata
- Download URL: tpu_util-0.1.3.tar.gz
- Size: 25.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.2 importlib_metadata/4.6.3 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.0 CPython/3.8.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | fcbdb8797667bfe7604605f0fa382860ea4106247e2675666789fecdb5ac02e4 |
| MD5 | ac65baa094eaf2de16af2ec1cab4a605 |
| BLAKE2b-256 | d5435d24bf90a50c410c7c4f1ec787f3a76d475ae246ef02b328bf0dbb9db8d5 |