
Multi Time Series Encoders

The objective of this Python package is to make it easy to encode and classify/regress multivariate time series (MTS) data, even when the series are asynchronous. We say that data are of the MTS type when each observation is associated with multiple time series (e.g. the vital signs of a patient over a given period).
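For illustration only (this is not the package's required input format, which is covered in the documentation), an asynchronous MTS record might look like the following, with each vital sign observed at its own irregular timestamps:

import pandas as pd

# Hypothetical asynchronous MTS record for one patient: each series (variable)
# has its own, irregular set of observation times.
record = pd.DataFrame({
    'time':     [0.0, 0.5, 0.5, 2.1, 3.0, 4.7],
    'variable': ['heart_rate', 'heart_rate', 'sbp', 'heart_rate', 'sbp', 'temp'],
    'value':    [82, 88, 121, 90, 118, 37.2],
})
print(record)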

Installation

The current version was developed in Python 3.7 and also works in Python 3.8. If you encounter an issue, please try running it again in a virtual environment with Python 3.7 or 3.8.

pip install mtse

Sample code

import mtse

### Load sample data ###
train, val, test, norm = mtse.get_sample(return_norm=True)

### Using the class `mtse` ###
mtan = mtse.mtse(device='cuda', seed=1, experiment_id='mtan')   # set up the experiment on the GPU
mtan.load_data(train, val, test, norm=norm)                     # register the train/validation/test sets
mtan.build_model('mtan', 'regression', learn_emb=True, early_stop=10, cuda_empty_cache=True)
mtan.train(lossf='mape', n_iters=200, save_strategy='best')     # train with the MAPE loss, keeping the best checkpoint
mtan.predict(checkpoint='best')                                 # predict with the best checkpoint
mtan.encode_ts(data_to_embed='test', embed_pandas=True)         # embed the test set, returned as a pandas DataFrame
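The example above assumes a CUDA-capable GPU. A minimal variation for machines without one, assuming PyTorch is installed and that the device argument also accepts 'cpu' (an inference from the 'cuda' value above, not something confirmed here):

import torch
import mtse

# Fall back to the CPU when no GPU is available. Passing 'cpu' here is an
# assumption inferred from the 'cuda' value shown above, not documented behaviour.
device = 'cuda' if torch.cuda.is_available() else 'cpu'
mtan = mtse.mtse(device=device, seed=1, experiment_id='mtan')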

More details and examples are available in the documentation.

What can be implemented / improved

Encoders

  • mTAN - Multi Time Attention Network - encoder
  • mTAN - Multi Time Attention Network - encoder-decoder
  • SeFT - Set Functions for Time Series
  • STraTS - Self-supervised Transformer for Time-Series
  • ODE-based encoders

Note that only the mTAN encoder has been implemented so far, as a baseline. At this stage, this model supports only supervised learning: it uses the target variable to compute the loss and update the encoder weights. The priority is therefore to implement an unsupervised encoder next (encoder-decoder models or self-supervised encoders).

Other features

  • Cross-validation evaluation, prediction and encoding (see the sketch after this list)
  • Support for other data inputs in the dataset classes (currently the mtan_Dataset class)
  • Support for time-series forecasting and inference tasks
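As a rough illustration of the first item, here is a minimal sketch of how k-fold cross-validation could be layered on top of the current API. The way train is split is hypothetical: it assumes the object returned by mtse.get_sample can be indexed like a sequence, which may not hold for the real data structures.

import numpy as np
from sklearn.model_selection import KFold

import mtse

train, val, test, norm = mtse.get_sample(return_norm=True)

kf = KFold(n_splits=5, shuffle=True, random_state=1)
fold_predictions = []
for fold, (tr_idx, va_idx) in enumerate(kf.split(np.arange(len(train)))):
    # Hypothetical indexing: assumes `train` behaves like a sequence of samples.
    fold_train = [train[i] for i in tr_idx]
    fold_val = [train[i] for i in va_idx]

    model = mtse.mtse(device='cuda', seed=1, experiment_id=f'mtan_cv{fold}')
    model.load_data(fold_train, fold_val, test, norm=norm)
    model.build_model('mtan', 'regression', learn_emb=True, early_stop=10)
    model.train(lossf='mape', n_iters=200, save_strategy='best')
    # The structure of the returned predictions is not documented here.
    fold_predictions.append(model.predict(checkpoint='best'))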

References

Satya Narayan Shukla and Benjamin Marlin, "Multi-Time Attention Networks for Irregularly Sampled Time Series", International Conference on Learning Representations, 2021.
