Deep learning time series with TensorFlow


Documentation | Tutorials | Release Notes | Chinese

TFTS (TensorFlow Time Series) is an easy-to-use time series package that supports classical and state-of-the-art deep learning methods built on TensorFlow/Keras.

  • Supports state-of-the-art models for time series tasks (prediction, classification, anomaly detection)
  • Provides advanced deep learning models for industry, research, and competitions
  • Documentation lives at time-series-prediction.readthedocs.io

Tutorial

Installation

  • python >= 3.7
  • tensorflow >= 2.4
pip install tfts
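
To verify the install, a quick sanity check using the standard library's importlib.metadata (available on Python 3.8+):

import importlib.metadata
print(importlib.metadata.version("tfts"))  # prints the installed version, e.g. 0.0.19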

Quick start


import matplotlib.pyplot as plt
import tensorflow as tf
import tfts
from tfts import AutoModel, AutoConfig, KerasTrainer

train_length = 24
predict_sequence_length = 8
(x_train, y_train), (x_valid, y_valid) = tfts.get_data("sine", train_length, predict_sequence_length, test_size=0.2)

model_name_or_path = 'seq2seq'  # 'wavenet', 'transformer', 'rnn', 'tcn', 'bert', 'dlinear', 'nbeats', 'informer', 'autoformer'
config = AutoConfig.for_model(model_name_or_path)
model = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
trainer = KerasTrainer(model, optimizer=tf.keras.optimizers.Adam(0.0007))
trainer.train((x_train, y_train), (x_valid, y_valid), epochs=30)

pred = trainer.predict(x_valid)
trainer.plot(history=x_valid, true=y_valid, pred=pred)
plt.show()

Prepare your own data

You can train on your own data by preparing 3D arrays for both inputs and targets, in either of two formats (both sketched in the sections below):

  • Option 1: np.ndarray
  • Option 2: tf.data.Dataset

Encoder-only model inputs

import numpy as np
from tfts import AutoConfig, AutoModel, KerasTrainer

train_length = 24
predict_sequence_length = 8
n_feature = 2

x_train = np.random.rand(1, train_length, n_feature)  # inputs: (batch, train_length, feature)
y_train = np.random.rand(1, predict_sequence_length, 1)  # target: (batch, predict_sequence_length, 1)
x_valid = np.random.rand(1, train_length, n_feature)
y_valid = np.random.rand(1, predict_sequence_length, 1)

config = AutoConfig.for_model('rnn')
model = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
trainer = KerasTrainer(model)
trainer.train(train_dataset=(x_train, y_train), valid_dataset=(x_valid, y_valid), epochs=1)
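
As in the quick start, the trained model can then generate forecasts with trainer.predict; the expected output shape below is an assumption inferred from the target shape above:

pred = trainer.predict(x_valid)
print(pred.shape)  # expected: (1, predict_sequence_length, 1), matching y_valid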

Encoder-decoder model inputs

# option1: np.ndarray
import numpy as np
from tfts import AutoConfig, AutoModel, KerasTrainer

train_length = 24
predict_sequence_length = 8
n_encoder_feature = 2
n_decoder_feature = 3

x_train = (
    np.random.rand(1, train_length, 1),  # inputs: (batch, train_length, 1)
    np.random.rand(1, train_length, n_encoder_feature),  # encoder_feature: (batch, train_length, encoder_features)
    np.random.rand(1, predict_sequence_length, n_decoder_feature),  # decoder_feature: (batch, predict_sequence_length, decoder_features)
)
y_train = np.random.rand(1, predict_sequence_length, 1)  # target: (batch, predict_sequence_length, 1)

x_valid = (
    np.random.rand(1, train_length, 1),
    np.random.rand(1, train_length, n_encoder_feature),
    np.random.rand(1, predict_sequence_length, n_decoder_feature),
)
y_valid = np.random.rand(1, predict_sequence_length, 1)

config = AutoConfig.for_model("seq2seq")
model = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
trainer = KerasTrainer(model)
trainer.train((x_train, y_train), (x_valid, y_valid), epochs=1)

# option2: tf.data.Dataset
import numpy as np
import tensorflow as tf
from tfts import AutoConfig, AutoModel, KerasTrainer

class FakeReader:
    def __init__(self, predict_sequence_length):
        train_length = 24
        n_encoder_feature = 2
        n_decoder_feature = 3
        self.x = np.random.rand(15, train_length, 1)
        self.encoder_feature = np.random.rand(15, train_length, n_encoder_feature)
        self.decoder_feature = np.random.rand(15, predict_sequence_length, n_decoder_feature)
        self.target = np.random.rand(15, predict_sequence_length, 1)

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return {
            "x": self.x[idx],
            "encoder_feature": self.encoder_feature[idx],
            "decoder_feature": self.decoder_feature[idx],
        }, self.target[idx]

    def iter(self):
        for i in range(len(self.x)):
            yield self[i]

predict_sequence_length = 10
train_reader = FakeReader(predict_sequence_length=predict_sequence_length)
train_loader = tf.data.Dataset.from_generator(
    train_reader.iter,
    ({"x": tf.float32, "encoder_feature": tf.float32, "decoder_feature": tf.float32}, tf.float32),
)
train_loader = train_loader.batch(batch_size=1)
valid_reader = FakeReader(predict_sequence_length=predict_sequence_length)
valid_loader = tf.data.Dataset.from_generator(
    valid_reader.iter,
    ({"x": tf.float32, "encoder_feature": tf.float32, "decoder_feature": tf.float32}, tf.float32),
)
valid_loader = valid_loader.batch(batch_size=1)

config = AutoConfig.for_model("seq2seq")
model = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
trainer = KerasTrainer(model)
trainer.train(train_dataset=train_loader, valid_dataset=valid_loader, epochs=1)
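
For real datasets, the standard tf.data performance options apply on top of this; an optional tweak, not something tfts requires:

train_loader = train_loader.shuffle(buffer_size=16)  # shuffles batches here; call shuffle before batch for element-level shuffling
train_loader = train_loader.prefetch(tf.data.AUTOTUNE)  # overlap data preparation with training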

Prepare custom model config

from tfts import AutoModel, AutoConfig

config = AutoConfig.for_model('rnn')
print(config)
config.rnn_hidden_size = 128

model = AutoModel.from_config(config, predict_sequence_length=7)
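
The customized config then plugs into the same training flow as before; a minimal sketch reusing the sine toy data and KerasTrainer from the quick start:

import tfts
from tfts import KerasTrainer

(x_train, y_train), (x_valid, y_valid) = tfts.get_data("sine", 24, 7, test_size=0.2)
trainer = KerasTrainer(model)
trainer.train((x_train, y_train), (x_valid, y_valid), epochs=1)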

Build your own model

Full list of models supported by tfts AutoModel:
  • rnn
  • tcn
  • bert
  • nbeats
  • dlinear
  • seq2seq
  • wavenet
  • transformer
  • informer
  • autoformer

You can build a custom model on top of tfts, for example:

  • add custom embeddings for categorical variables
  • add custom head layers for classification or anomaly detection tasks

import tensorflow as tf
from tensorflow.keras.layers import Input, Dense
from tfts import AutoModel, AutoConfig

train_length = 24
num_train_features = 15
predict_sequence_length = 8

def build_model():
    inputs = Input([train_length, num_train_features])
    config = AutoConfig.for_model("seq2seq")
    backbone = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
    outputs = backbone(inputs)
    outputs = Dense(1, activation="sigmoid")(outputs)
    model = tf.keras.Model(inputs=inputs, outputs=outputs)
    model.compile(loss="mse", optimizer="rmsprop")
    return model
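
A usage sketch for this custom model; the input and target shapes are assumptions carried over from the examples above (one target value per predicted step):

import numpy as np

model = build_model()
x = np.random.rand(4, train_length, num_train_features)  # (batch, train_length, features)
y = np.random.rand(4, predict_sequence_length, 1)  # assumed target shape
model.fit(x, y, epochs=1)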

Examples

Citation

If you find the tfts project useful in your research, please consider citing it:

@misc{tfts2020,
  author = {Longxing Tan},
  title = {Time series prediction},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/longxingtan/time-series-prediction}},
}
