Timeseries Learning Library for PyTorch.

pytorch_timeseries

An all-in-one deep learning library that boosts your time series research. Check the documentation for more details.

Compared to previous libraries, pytorch_timeseries offers:

  • automatic dataset downloading
  • an easy-to-use, easy-to-extend API
  • clear documentation
  • high customizability
  • install and run!
  • ..........

1. Installation

pip install torch-timeseries

⚠️ Warning: Only Python >= 3.8 is supported.

2. Running Implemented Experiments

Forecast

# running DLinear Forecast on dataset ETTh1 with seed = 3 
pytexp --model DLinear --task Forecast --dataset_type ETTh1 run 3
# running DLinear Forecast on dataset ETTh1 with seeds=[1,2,3]
pytexp --model DLinear --task Forecast --dataset_type ETTh1 runs '[1,2,3]'

Imputation

# running DLinear Imputation on dataset ETTh1 with seed = 3 
pytexp --model DLinear --task Imputation --dataset_type ETTh1 run 3
# running DLinear Imputation on dataset ETTh1 with seeds = [1,2,3]
pytexp --model DLinear --task Imputation --dataset_type ETTh1 runs '[1,2,3]'

UEAClassification

# running DLinear UEAClassification on dataset EthanolConcentration with seed = 3 
pytexp --model DLinear --task UEAClassification --dataset_type EthanolConcentration run 3
# running DLinear UEAClassification on dataset EthanolConcentration with seeds = [1,2,3]
pytexp --model DLinear --task UEAClassification --dataset_type EthanolConcentration runs '[1,2,3]'

AnomalyDetection

# running DLinear AnomalyDetection on dataset MSL with seed = 3
pytexp --model DLinear --task AnomalyDetection --dataset_type MSL run 3
# running DLinear AnomalyDetection on dataset MSL with seeds = [1,2,3]
pytexp --model DLinear --task AnomalyDetection --dataset_type MSL runs '[1,2,3]'
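
The same command pattern should extend to the other implemented models and datasets listed under Development Milestones below; for example (an illustrative sketch assuming PatchTST and ETTh2 are wired into the pytexp CLI exactly like DLinear and ETTh1):

# running PatchTST Forecast on dataset ETTh2 with seed = 1 (illustrative)
pytexp --model PatchTST --task Forecast --dataset_type ETTh2 run 1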

Development Milestones

Implemented Datasets

The full list of datasets can be found in the documentation.

Datasets (per-task support for Forecasting, Imputation, Anomaly Detection, and Classification is detailed in the documentation):

  • ETTh1
  • ETTh2
  • ETTm1
  • ETTm2
  • ......and more

Implemented Tasks

  • Forecast
  • Classification (for UEA datasets)
  • Anomaly Detection
  • Imputation
  • You can fill this checkbox! (Contribute by developing your own task!)

Implemented Models

Models (per-task support for Forecasting, Imputation, Anomaly Detection, and Classification is detailed in the documentation):

  • Informer (2021)
  • Autoformer (2021)
  • FEDformer (2022)
  • DLinear (2022)
  • PatchTST (2022)

Customizing Your Own Pipeline

We provide examples of customizing your own pipeline. A custom forecasting pipeline is built as follows:

1 Forecasting

1.1 Download the dataset

The dataset will be downloaded automatically.

from torch_timeseries.dataset import ETTh1
from torch_timeseries.dataloader import StandardScaler, SlidingWindow, SlidingWindowTS
from torch_timeseries.model import DLinear
from torch.nn import MSELoss, L1Loss
from torch.optim import Adam
dataset = ETTh1('./data')
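
Instantiating ETTh1 fetches the data into ./data. As a quick sanity check, you can inspect the attribute the training example below relies on (a minimal sketch using only what is shown in this guide):

# number of variables in ETTh1; passed to DLinear in the training step below
print(dataset.num_features)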

1.2 Set up the scaler/dataloader

Once you set up a dataloader and pass a scaler into it, the scaler will be fitted on the training set.

scaler = StandardScaler()
dataloader = SlidingWindowTS(dataset, 
                        window=96,
                        horizon=1,
                        steps=336,
                        batch_size=32, 
                        train_ratio=0.7, 
                        val_ratio=0.2, 
                        scaler=scaler,
                        )

After this, you can access the train/val/test loaders via dataloader.train_loader, dataloader.val_loader, and dataloader.test_loader.
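
For example, you can pull a single batch to inspect the tensors (a minimal sketch; the shapes in the comments follow from the window/steps/batch_size settings above and are stated as an assumption, not a documented guarantee):

# each loader yields (scaled_x, scaled_y, x, y, x_date_enc, y_date_enc)
scaled_x, scaled_y, x, y, x_date_enc, y_date_enc = next(iter(dataloader.train_loader))
print(scaled_x.shape)  # expected: (batch_size, window, num_features)
print(scaled_y.shape)  # expected: (batch_size, steps, num_features)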

1.3 Training

model = DLinear(dataloader.window, dataloader.steps, dataset.num_features, individual=True)
optimizer = Adam(model.parameters())
loss_function = MSELoss()

# train
model.train()
for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.train_loader:
    optimizer.zero_grad()
    
    scaled_x = scaled_x.float()
    scaled_y = scaled_y.float()
    scaled_pred_y = model(scaled_x)  # DLinear predicts from the input window alone; the date encodings are not needed
    
    loss = loss_function(scaled_pred_y, scaled_y)
    loss.backward()
    optimizer.step()
    print(loss.item())
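
Once training finishes, you can persist the weights with plain PyTorch (this is standard PyTorch usage, not a library-specific API; the file name is only an example):

import torch

# save and reload the trained weights using standard PyTorch utilities
torch.save(model.state_dict(), 'dlinear_etth1.pt')
model.load_state_dict(torch.load('dlinear_etth1.pt'))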

1.4 Validation/test

# val
model.eval()
for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.val_loader:
    ...  # your validation code here

# test
model.eval()
for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.test_loader:
    ...  # your test code here
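
If you want a concrete starting point, the sketch below fills the validation placeholder with a scaled-space MSE computation (illustrative only, not the library's built-in evaluation; it reuses the model, loss_function, and dataloader defined above):

import torch

# validation: average MSE over the validation loader, computed in the scaled space
model.eval()
total_loss, num_batches = 0.0, 0
with torch.no_grad():
    for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.val_loader:
        scaled_x = scaled_x.float()
        scaled_y = scaled_y.float()
        scaled_pred_y = model(scaled_x)
        total_loss += loss_function(scaled_pred_y, scaled_y).item()
        num_batches += 1
print(f"val MSE (scaled): {total_loss / num_batches:.6f}")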

Dev Install

Install requirements

Note: This library assumes you have installed PyTorch by following its official instructions; the basic torch-related dependencies may not be listed in the requirements files: https://pytorch.org/get-started/locally/

The recommended Python version is 3.8.1+.

  1. Fork this project.

  2. Clone this project (latest version):

git clone https://github.com/wayne155/pytorch_timeseries

  3. Install the requirements:

pip install -r ./requirements.txt

  4. Change some code and push it to your forked repo.

  5. Create a pull request to this repo.
