Timeseries Learning Library for PyTorch.

pytorch_timeseries

An all-in-one deep learning library that boosts your time series research. Check the documentation for more details.

Compared to previous libraries, pytorch_timeseries offers:

  • automatically downloaded datasets
  • an easy-to-use, easily extensible API
  • clear documentation
  • high customizability
  • and more

installation

pip install torch-timeseries

⚠️ Warning: Only Python >= 3.8 is supported.

additional install

For running Graph Neural Network based models, pytorch_geometric is also needed.

pip install torch_geometric

# Optional dependencies
pip install pyg_lib torch_scatter torch_sparse torch_cluster torch_spline_conv -f https://data.pyg.org/whl/torch-2.0.0+cu118.html
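
To confirm the extra dependency is importable after installation, a minimal check (not specific to this library):

# minimal check that torch_geometric was installed correctly
import torch_geometric
print(torch_geometric.__version__)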

Quick Start

1 Forecasting

1.1 download dataset

The dataset will be downloaded automatically.

from torch_timeseries.dataset import ETTh1
from torch_timeseries.dataloader import StandardScaler, SlidingWindow, SlidingWindowTS
from torch_timeseries.model import DLinear
from torch.nn import MSELoss, L1Loss
from torch.optim import Adam
dataset = ETTh1('./data')

1.2 setup scaler/dataloader

Once you set up a dataloader and pass a scaler to it, the scaler is fitted on the training set.

scaler = StandardScaler()
dataloader = SlidingWindowTS(dataset, 
                        window=96,
                        horizon=1,
                        steps=336,
                        batch_size=32, 
                        train_ratio=0.7, 
                        val_ratio=0.2, 
                        scaler=scaler,
                        )

After this, you can access the train/val/test loaders via dataloader.train_loader, dataloader.val_loader, and dataloader.test_loader.
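
For example, a minimal sketch that fetches one batch from the fitted train loader and inspects the tensor shapes (the batch layout follows the training loop below; the exact shapes are an assumption based on the window/steps settings above):

scaled_x, scaled_y, x, y, x_date_enc, y_date_enc = next(iter(dataloader.train_loader))
print(scaled_x.shape)  # expected: (batch_size, window, num_features)
print(scaled_y.shape)  # expected: (batch_size, steps, num_features)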

1.3 training

model = DLinear(dataloader.window, dataloader.steps, dataset.num_features, individual=True)
optimizer = Adam(model.parameters())
loss_function = MSELoss()

# train
model.train()
for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.train_loader:
    optimizer.zero_grad()
    
    scaled_x = scaled_x.float()
    scaled_y = scaled_y.float()
    scaled_pred_y = model(scaled_x) 
    
    loss = loss_function(scaled_pred_y, scaled_y)
    loss.backward()
    optimizer.step()
    print(loss)

1.4 val/test

# val
model.eval()
for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.val_loader:
    ...  # your validation code here

# test
model.eval()
for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.test_loader:
    ...  # your test code here (see the evaluation sketch below)
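
As a concrete example, here is a minimal evaluation sketch for the test loader. It assumes predictions are compared against targets in scaled space, as in the training loop above, and simply averages MSE and MAE over batches (MSELoss and L1Loss come from the imports at the top of this example):

import torch

# minimal evaluation sketch: average MSE/MAE over the test set in scaled space
mse, mae = MSELoss(), L1Loss()
model.eval()
mse_sum, mae_sum, n_batches = 0.0, 0.0, 0
with torch.no_grad():
    for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.test_loader:
        scaled_pred_y = model(scaled_x.float())
        mse_sum += mse(scaled_pred_y, scaled_y.float()).item()
        mae_sum += mae(scaled_pred_y, scaled_y.float()).item()
        n_batches += 1
print(f"test MSE: {mse_sum / n_batches:.4f}  test MAE: {mae_sum / n_batches:.4f}")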

2 Imputation

2.1 download dataset

The dataset will be downloaded automatically.

from torch_timeseries.dataset import ETTh1
from torch_timeseries.dataloader import StandardScaler, SlidingWindow, SlidingWindowTS
from torch_timeseries.model import DLinear
from torch.nn import MSELoss, L1Loss
from torch.optim import Adam
dataset = ETTh1('./data')

2.2 setup scaler/dataloader

Once you set up a dataloader and pass a scaler to it, the scaler is fitted on the training set.

scaler = StandardScaler()
dataloader = SlidingWindowTS(dataset, 
                        window=96,
                        horizon=1,
                        steps=336,
                        batch_size=32, 
                        train_ratio=0.7, 
                        val_ratio=0.2, 
                        scaler=scaler,
                        )

After this, you can access the train/val/test loaders via dataloader.train_loader, dataloader.val_loader, and dataloader.test_loader.

2.3 training

model = DLinear(dataloader.window, dataloader.steps, dataset.num_features, individual=True)
optimizer = Adam(model.parameters())
loss_function = MSELoss()

# train
model.train()
for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.train_loader:
    optimizer.zero_grad()
    
    scaled_x = scaled_x.float()
    scaled_y = scaled_y.float()
    scaled_pred_y = model(scaled_x) 
    
    loss = loss_function(scaled_pred_y, scaled_y)
    loss.backward()
    optimizer.step()
    print(loss)

2.4 val/test

# val
model.eval()
for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.val_loader:
    scaled_x = scaled_x.float()
    scaled_y = scaled_y.float()
    scaled_pred_y = model(scaled_x) 
    loss = loss_function(scaled_pred_y, scaled_y)
    

# test
model.eval()
for scaled_x, scaled_y, x, y, x_date_enc, y_date_enc in dataloader.test_loader:
    scaled_x = scaled_x.float()
    scaled_y = scaled_y.float()
    scaled_pred_y = model(scaled_x) 
    loss = loss_function(scaled_pred_y, scaled_y)
    

dev install

install requirements

Note: This library assumes that you have installed PyTorch according to its official website (https://pytorch.org/get-started/locally/); the basic torch-related dependencies may not be listed in the requirements files.

The recommended Python version is 3.8.1+. Please first install torch according to your environment.

pip3 install torch torchvision torchaudio
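
After installing, a minimal check (not specific to this library) that PyTorch is importable and whether CUDA is available:

import torch
print(torch.__version__)          # installed PyTorch version
print(torch.cuda.is_available())  # True if a CUDA-capable GPU is usable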
