Python implementation of soft-DTW

Project description

Python implementation of soft-DTW.

What is it?

The celebrated dynamic time warping (DTW) [1] defines the discrepancy between two time series, of possibly variable length, as their minimal alignment cost. Although the number of possible alignments is exponential in the length of the two time series, [1] showed that DTW can be computed in only quadratic time using dynamic programming.
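
For intuition, here is a minimal NumPy sketch of that quadratic-time recursion (purely illustrative; the dtw function below is not part of this package, which implements the computation in Cython). It assumes a precomputed pairwise cost matrix D:

import numpy as np

def dtw(D):
    # D: pairwise cost matrix, shape [m, n]
    m, n = D.shape
    R = np.full((m + 1, n + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            # best alignment cost ending at cell (i, j)
            R[i, j] = D[i - 1, j - 1] + min(R[i - 1, j],      # insertion
                                            R[i, j - 1],      # deletion
                                            R[i - 1, j - 1])  # match
    return R[m, n]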

Soft-DTW [2] proposes to replace this minimum with a soft minimum. Like the original DTW, soft-DTW can be computed in quadratic time using dynamic programming. Its main advantage, however, is that it is differentiable everywhere and that its gradient can also be computed in quadratic time. This makes it possible to use soft-DTW for time-series averaging, or as a loss function between a ground-truth time series and the time series predicted by a neural network trained end-to-end using backpropagation.
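
Concretely, the soft minimum of values a_1, ..., a_n with smoothing parameter gamma > 0 is -gamma * log(sum_i exp(-a_i / gamma)), which recovers the hard minimum as gamma -> 0. A naive sketch of the resulting recursion (again only for illustration; softmin and soft_dtw below are not the package's API):

import numpy as np
from scipy.special import logsumexp

def softmin(args, gamma):
    # soft minimum: -gamma * log(sum(exp(-a / gamma))); tends to min(args) as gamma -> 0
    return -gamma * logsumexp(-np.asarray(args) / gamma)

def soft_dtw(D, gamma=1.0):
    # D: pairwise cost matrix, shape [m, n]
    m, n = D.shape
    R = np.full((m + 1, n + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            R[i, j] = D[i - 1, j - 1] + softmin(
                [R[i - 1, j], R[i, j - 1], R[i - 1, j - 1]], gamma)
    return R[m, n]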

Supported features

  • soft-DTW (forward pass) and gradient (backward pass) computations, implemented in Cython for speed

  • barycenters (time series averaging)

  • dataset loader for the UCR archive

  • Chainer function

Planned features

  • PyTorch function

Example

from sdtw import SoftDTW
from sdtw.distance import SquaredEuclidean

# Time series 1: numpy array, shape = [m, d] where m = length and d = dim
X = ...
# Time series 2: numpy array, shape = [n, d] where n = length and d = dim
Y = ...

# D can also be an arbitrary distance matrix: numpy array, shape [m, n]
D = SquaredEuclidean(X, Y)
sdtw = SoftDTW(D, gamma=1.0)
# soft-DTW discrepancy, approaches DTW as gamma -> 0
value = sdtw.compute()
# gradient w.r.t. D, shape = [m, n], which is also the expected alignment matrix
E = sdtw.grad()
# gradient w.r.t. X, shape = [m, d]
G = D.jacobian_product(E)
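
For a concrete run, the placeholders above can be filled in with random arrays; the shapes below are arbitrary, and the calls are exactly those shown above:

import numpy as np
from sdtw import SoftDTW
from sdtw.distance import SquaredEuclidean

rng = np.random.RandomState(0)
X = rng.randn(10, 3)   # time series of length 10, dimension 3
Y = rng.randn(15, 3)   # time series of length 15, dimension 3

D = SquaredEuclidean(X, Y)
sdtw = SoftDTW(D, gamma=1.0)
value = sdtw.compute()       # scalar soft-DTW discrepancy
E = sdtw.grad()              # expected alignment matrix, shape [10, 15]
G = D.jacobian_product(E)    # gradient w.r.t. X, shape [10, 3]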

Installation

Binary packages are not available.

This project can be installed from its git repository. It is assumed that you have a working C compiler.

  1. Obtain the sources:

    git clone https://github.com/mblondel/soft-dtw.git

    or, if git is unavailable, download as a ZIP from GitHub.

  2. Install the dependencies:

    # via pip
    pip install numpy scipy scikit-learn cython nose

    # via conda
    conda install numpy scipy scikit-learn cython nose

  3. Build and install soft-dtw:

    cd soft-dtw
    python setup.py install
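
As a quick sanity check that the build succeeded (run from outside the source tree, so the installed package rather than the checkout is imported), the modules used in the example above should import cleanly:

    python -c "from sdtw import SoftDTW; from sdtw.distance import SquaredEuclidean; print('soft-dtw OK')"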

References

[1] Hiroaki Sakoe, Seibi Chiba. Dynamic programming algorithm optimization for spoken word recognition. IEEE Transactions on Acoustics, Speech, and Signal Processing, 1978.

[2] Marco Cuturi, Mathieu Blondel. Soft-DTW: a Differentiable Loss Function for Time-Series. In Proceedings of ICML, 2017.

Author

  • Mathieu Blondel, 2017

Download files

Download the file for your platform.

Source Distribution

soft-dtw-0.1.6.tar.gz (58.1 kB)

Uploaded Source

Built Distribution

soft_dtw-0.1.6-cp36-cp36m-macosx_10_13_x86_64.whl (29.3 kB)

Uploaded CPython 3.6m macOS 10.13+ x86-64

File details

Details for the file soft-dtw-0.1.6.tar.gz.

File metadata

  • Download URL: soft-dtw-0.1.6.tar.gz
  • Upload date:
  • Size: 58.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for soft-dtw-0.1.6.tar.gz
Algorithm Hash digest
SHA256 60498ee2049a6a0b49276d26deca8abfa007a9ac9c51afbaecfb1fa403806da8
MD5 d3773eecc33cbd537c5a38ec48ea72d1
BLAKE2b-256 eca899a1c684116c73dee995ed690a1ed9217ae0eb9514c0b06fdd386c7c21da

File details

Details for the file soft_dtw-0.1.6-cp36-cp36m-macosx_10_13_x86_64.whl.

File metadata

File hashes

Hashes for soft_dtw-0.1.6-cp36-cp36m-macosx_10_13_x86_64.whl
Algorithm Hash digest
SHA256 75092414fbdc31fe4c82e560aee4b4c5bd4993275e7934dea5dd24a461473f12
MD5 458e0e76bca6f9306b82f486b41e8a97
BLAKE2b-256 26e998305d8a5b37d4e0754cbdc42c9678582ca21d0838dc5efaa1c07cce82a6
