TSDB (Time Series Data Beans): a Python toolbox helping load 172 open-source time-series datasets

Project description

Welcome to TSDB

load 172 public time-series datasets with a single line of code ;-)

📣 TSDB now supports a total of 1️⃣7️⃣2️⃣ time-series datasets ‼️

TSDB is a part of PyPOTS (a Python toolbox for data mining on Partially-Observed Time Series), and was separated from PyPOTS for decoupling datasets from learning algorithms.

TSDB is created to free researchers and engineers from collecting and downloading data, so they can focus on data processing itself. TSDB provides one-stop convenience for downloading and loading open-source time-series datasets (the available datasets are listed below).

❗️Please note that because people have very different requirements for data processing, the data-loading functions in TSDB only perform the most general steps (e.g. removing invalid samples) and do not further process the data (not even normalize it). So, no worries: TSDB won't affect your data preprocessing. If you only want the raw datasets, TSDB can help you download and save them as well (take a look at the Usage Examples below).

🤝 If you need TSDB to integrate an open-source dataset, or want to add one yourself, please feel free to request it by creating an issue or to make a PR to merge your code.

🤗 Please star this repo to help others notice TSDB if you think it is a useful toolkit. Please properly cite TSDB and PyPOTS in your publications if it helps with your research. This really means a lot to our open-source research. Thank you!

❖ Usage Examples

[!IMPORTANT] TSDB is available on both PyPI and conda-forge ❗️

Install via pip:

```shell
pip install tsdb
```

or install from source code:

```shell
pip install https://github.com/WenjieDu/TSDB/archive/main.zip
```

or install via conda:

```shell
conda install tsdb -c conda-forge
```

```python
import tsdb

# list all available datasets in TSDB
tsdb.list()
# ['physionet_2012',
#  'physionet_2019',
#  'electricity_load_diagrams',
#  'beijing_multisite_air_quality',
#  'italy_air_quality',
#  'vessel_ais',
#  'electricity_transformer_temperature',
#  'pems_traffic',
#  'solar_alabama',
#  'ucr_uea_ACSF1',
#  'ucr_uea_Adiac',
#  ...

# select the dataset you need and load it; TSDB will download, extract, and process it automatically
data = tsdb.load('physionet_2012')
# if you need the raw data, use download_and_extract()
tsdb.download_and_extract('physionet_2012', './save_it_here')
# datasets you have loaded are cached, and you can check them with list_cache()
tsdb.list_cache()
# you can delete only one specific dataset's pickled cache
tsdb.delete_cache(dataset_name='physionet_2012', only_pickle=True)
# you can delete only one specific dataset's raw files and preserve others
tsdb.delete_cache(dataset_name='physionet_2012')
# or you can delete all cache with delete_cache() to free disk space
tsdb.delete_cache()

# The default cache directory is ~/.pypots/tsdb under the user's home directory.
# To avoid taking up too much space when downloading many datasets,
# the TSDB cache directory can be migrated to an external disk
tsdb.migrate_cache("/mnt/external_disk/TSDB_cache")
```
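Conceptually, migrating the cache just means moving every cached entry to the new directory and pointing TSDB at it afterwards. The sketch below illustrates that idea with the standard library only; it is NOT TSDB's actual implementation, and the function name `migrate` is hypothetical.

```python
# A minimal sketch of what cache migration involves; this is an
# illustration of the idea, NOT TSDB's actual implementation.
import os
import shutil


def migrate(old_dir: str, new_dir: str) -> str:
    """Move every entry from old_dir into new_dir and return new_dir."""
    os.makedirs(new_dir, exist_ok=True)
    for name in os.listdir(old_dir):
        shutil.move(os.path.join(old_dir, name), os.path.join(new_dir, name))
    os.rmdir(old_dir)  # remove the now-empty old cache directory
    return new_dir
```

Using `shutil.move` (rather than `os.rename`) matters here: it falls back to a copy-then-delete when the destination is on a different filesystem, which is exactly the external-disk case above.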

That's all. Simple and efficient. Enjoy it! 😃

❖ List of Available Datasets

| Name | Main Tasks |
|------|------------|
| PhysioNet Challenge 2012 | Forecasting, Imputation, Classification |
| PhysioNet Challenge 2019 | Forecasting, Imputation, Classification |
| Beijing Multi-Site Air-Quality | Forecasting, Imputation |
| Italy Air Quality | Forecasting, Imputation |
| Electricity Load Diagrams | Forecasting, Imputation |
| Electricity Transformer Temperature (ETT) | Forecasting, Imputation |
| Vessel AIS | Forecasting, Imputation, Classification |
| PeMS Traffic | Forecasting, Imputation |
| Solar Alabama | Forecasting, Imputation |
| UCR & UEA Datasets (all 163 datasets) | Classification |
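The 163 UCR & UEA archive datasets appear in `tsdb.list()` under a common prefix, as the sample names `'ucr_uea_ACSF1'` and `'ucr_uea_Adiac'` in the listing above suggest. A tiny sketch of filtering them out of the full name list, assuming that prefix convention holds for all of them:

```python
def ucr_uea_names(all_names):
    """Filter TSDB dataset names down to the UCR/UEA archive entries."""
    return [n for n in all_names if n.startswith("ucr_uea_")]


# e.g. with a few sample names from the listing earlier:
sample = ["physionet_2012", "ucr_uea_ACSF1", "ucr_uea_Adiac"]
# ucr_uea_names(sample) -> ["ucr_uea_ACSF1", "ucr_uea_Adiac"]
```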

❖ Citing TSDB/PyPOTS

The paper introducing PyPOTS is available on arXiv, and a short version of it was accepted by the 9th SIGKDD International Workshop on Mining and Learning from Time Series (MiLeTS'23). Additionally, PyPOTS has been included as a PyTorch Ecosystem project. We are working to publish it in prestigious academic venues, e.g. JMLR (track for Machine Learning Open Source Software). If you use PyPOTS in your work, please cite it as below and 🌟 star this repository to help others notice this library. 🤗

There are scientific research projects using PyPOTS and referencing it in their papers. Here is an incomplete list of them.

```bibtex
@article{du2023pypots,
  title={{PyPOTS: a Python toolbox for data mining on Partially-Observed Time Series}},
  author={Wenjie Du},
  journal={arXiv preprint arXiv:2305.18811},
  year={2023},
}
```

or

Wenjie Du. PyPOTS: a Python toolbox for data mining on Partially-Observed Time Series. arXiv, abs/2305.18811, 2023.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tsdb-0.6.2.tar.gz (28.7 kB)

Uploaded Source

Built Distribution

tsdb-0.6.2-py3-none-any.whl (32.3 kB)

Uploaded Python 3

File details

Details for the file tsdb-0.6.2.tar.gz.

File metadata

  • Download URL: tsdb-0.6.2.tar.gz
  • Upload date:
  • Size: 28.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.9

File hashes

Hashes for tsdb-0.6.2.tar.gz

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | def588023ddc205f4c9d5baa8942e3eeaf2d6b561310f59573ae2539ec05da9d |
| MD5 | 55a7ca7e757130cba8d101e1dcc7f363 |
| BLAKE2b-256 | 51b24d54c013d5203765b145ae83452be4a10854af7173145a316eb11a19d269 |

See more details on using hashes here.
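To check a downloaded file against the SHA256 digest listed above, a small standard-library helper suffices; the function name `sha256_of` is our own, not part of TSDB or PyPI tooling.

```python
# Compute a file's SHA256 digest in chunks, so large archives
# don't need to fit in memory at once.
import hashlib


def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# usage (expected digest copied from the table above):
# assert sha256_of("tsdb-0.6.2.tar.gz") == (
#     "def588023ddc205f4c9d5baa8942e3eeaf2d6b561310f59573ae2539ec05da9d"
# )
```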

File details

Details for the file tsdb-0.6.2-py3-none-any.whl.

File metadata

  • Download URL: tsdb-0.6.2-py3-none-any.whl
  • Upload date:
  • Size: 32.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.9

File hashes

Hashes for tsdb-0.6.2-py3-none-any.whl

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 9bae7dbe0db2da59bdcc6845a69411880453987b4afac79589c4aaa28e4ec90a |
| MD5 | 517bc197b028dec63e94641e0bdece2a |
| BLAKE2b-256 | 47d385153d802c68539113e5deffcf23ab0e08660aa283e04d45c943b7c13d01 |

See more details on using hashes here.
