Spare Cores Crawler

Pull and standardize data on cloud compute resources.
SC Crawler is a Python package to pull and standardize data on cloud compute resources, with tooling to help organize and update the collected data into databases.
Installation
Stable version from PyPI:
pip install sparecores-crawler
Most recent version from GitHub:
pip install "sparecores-crawler @ git+https://git@github.com/SpareCores/sc-crawler.git"
Download files

Source Distribution: sparecores_crawler-0.1.2.tar.gz (64.0 kB)

Built Distribution: sparecores_crawler-0.1.2-py3-none-any.whl
Hashes for sparecores_crawler-0.1.2-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | a0c0fb4dd1f66bd5e543725c84aefd0b2d3e6169b70de70ad563c741796ea0ff
MD5 | 560055e2e4a08b30e85d96fa1208630c
BLAKE2b-256 | 1adc788995c4f24ef4bffc6ef9374c2079046b0eb0ca0769f7477a05cc1b3e6f
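A minimal sketch for checking a downloaded file against the published SHA-256 digest before installing it; the filename is assumed to match the wheel in the listing above, and the helper name is illustrative, not part of the package:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        while chunk := fh.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Digest taken from the hash table above.
expected = "a0c0fb4dd1f66bd5e543725c84aefd0b2d3e6169b70de70ad563c741796ea0ff"

# Uncomment after downloading the wheel into the current directory:
# actual = sha256_of("sparecores_crawler-0.1.2-py3-none-any.whl")
# assert actual == expected, "hash mismatch: do not install this file"
```

Reading in fixed-size chunks keeps memory use constant even for large artifacts.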