Structured data collected by sparecores-crawler.
Spare Cores Data
SC Data is a Python package and a set of related tools that use
sparecores-crawler to pull and standardize data on cloud compute
resources. This repository runs the crawler every 5 minutes to update
spot prices, and every hour to update all cloud resources both in an
internal SCD table and in a public SQLite snapshot.
Installation
Stable version from PyPI:
pip install sparecores-data
Most recent version from GitHub:
pip install "sparecores-data @ git+https://git@github.com/SpareCores/sc-data.git"
Usage
For easy access to the SQLite database file, import the db object of
the sc_data Python package, which runs an updater thread in the
background to keep the SQLite file up to date:

from sc_data import db
print(db.path)
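The background-updater pattern described above can be sketched roughly as follows. This is an illustrative outline only, not the actual sc_data implementation; the FileUpdater class and its methods are hypothetical:

```python
import tempfile
import threading

class FileUpdater:
    """Illustrative sketch: keep a local snapshot file refreshed on a
    daemon thread. NOT the sc_data implementation, only the pattern."""

    def __init__(self, refresh_seconds: float = 600):
        self.refresh_seconds = refresh_seconds
        self.path = tempfile.NamedTemporaryFile(suffix=".sqlite", delete=False).name
        self._stop = threading.Event()
        self._refresh()  # fetch once up front so self.path is usable immediately
        self._thread = threading.Thread(target=self._loop, daemon=True)
        self._thread.start()

    def _refresh(self):
        # A real implementation would download the latest SQLite snapshot here.
        with open(self.path, "wb") as f:
            f.write(b"placeholder")

    def _loop(self):
        # Event.wait doubles as an interruptible sleep between refreshes.
        while not self._stop.wait(self.refresh_seconds):
            self._refresh()

    def stop(self):
        self._stop.set()

db = FileUpdater(refresh_seconds=600)
print(db.path)
db.stop()
```

Because the thread is a daemon, it does not block interpreter shutdown, which matches the "import and forget" usage shown above.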
By default, the SQLite file is updated every 600 seconds; this
interval can be overridden via the sc_data_db_refresh_seconds
builtins attribute or the SC_DATA_DB_REFRESH_SECONDS environment
variable.
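One way such a setting can be resolved is sketched below. The helper function and the precedence order (builtins attribute first, then environment variable, then the 600-second default) are assumptions for illustration; the actual lookup inside sc_data may differ:

```python
import builtins
import os

DEFAULT_REFRESH_SECONDS = 600

def resolve_refresh_seconds() -> int:
    # Hypothetical helper: check the builtins attribute first, then the
    # environment variable, then fall back to the 600-second default.
    value = getattr(builtins, "sc_data_db_refresh_seconds", None)
    if value is None:
        value = os.environ.get("SC_DATA_DB_REFRESH_SECONDS")
    return int(value) if value is not None else DEFAULT_REFRESH_SECONDS

print(resolve_refresh_seconds())  # 600: nothing set yet

os.environ["SC_DATA_DB_REFRESH_SECONDS"] = "120"
print(resolve_refresh_seconds())  # 120: env var takes effect

builtins.sc_data_db_refresh_seconds = 60
print(resolve_refresh_seconds())  # 60: builtins attribute wins here
```

In either case, set the override before importing sc_data so the updater thread starts with the desired interval.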