Spare Cores Data
Structured data collected by sparecores-crawler.
SC Data is a Python package and a set of related tools that use
sparecores-crawler
to pull and standardize data on cloud compute resources. This repository
runs the crawler every 5 minutes to update spot prices, and every hour
to refresh all cloud resources both in an internal SCD table and in a
public SQLite snapshot.
Installation
Stable version from PyPI:
pip install sparecores-data
Most recent version from GitHub:
pip install "sparecores-data @ git+https://git@github.com/SpareCores/sc-data.git"
Usage
For easy access to the SQLite database file, import the db
object
of the sc_data
Python package, which runs an updater thread in the
background to keep the SQLite file up-to-date:
from sc_data import db
print(db.path)
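Once db.path is available, the snapshot can be inspected with Python's standard sqlite3 module. The helper below is an illustrative sketch: the list_tables function is my own, and no assumptions are made about the actual table names inside the snapshot.

```python
import sqlite3


def list_tables(path):
    """Return the names of all tables in the SQLite file at `path`."""
    conn = sqlite3.connect(path)
    try:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
    finally:
        conn.close()
    return [name for (name,) in rows]


# With the package installed, pass the managed snapshot path:
#   from sc_data import db
#   list_tables(db.path)
```

Opening a fresh connection per query keeps the example independent of the updater thread, which may replace the file in the background.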
By default, the SQLite file is updated every 600 seconds; this interval
can be overridden via the sc_data_db_refresh_seconds
builtins
attribute or the SC_DATA_DB_REFRESH_SECONDS
environment variable.
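For example, to shorten the refresh interval to 60 seconds, set either knob before importing the package. Both names come from the paragraph above; the sketch assumes they are read at import time:

```python
import builtins
import os

# Option 1: builtins attribute, set before `from sc_data import db`
builtins.sc_data_db_refresh_seconds = 60

# Option 2: environment variable, e.g. for containerized deployments
os.environ["SC_DATA_DB_REFRESH_SECONDS"] = "60"

# then: from sc_data import db
```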