A Python library for easy manipulation and forecasting of time series.
Project description
Time Series Made Easy in Python
Darts is a Python library for user-friendly forecasting and anomaly detection
on time series. It contains a variety of models, from classics such as ARIMA to
deep neural networks. The forecasting models can all be used in the same way, using fit() and predict() functions, similar to scikit-learn.
The library also makes it easy to backtest models,
combine the predictions of several models, and take external data into account.
Darts supports both univariate and multivariate time series and models.
The ML-based models can be trained on potentially large datasets containing multiple time
series, and some of the models offer rich support for probabilistic forecasting.
Darts also offers extensive anomaly detection capabilities. For instance, it is trivial to apply PyOD models on time series to obtain anomaly scores, or to wrap any of Darts forecasting or filtering models to obtain fully fledged anomaly detection models.
Documentation
High Level Introductions
Articles on Selected Topics
- Training Models on Multiple Time Series
- Using Past and Future Covariates
- Temporal Convolutional Networks and Forecasting
- Probabilistic Forecasting
- Transfer Learning for Time Series Forecasting
- Hierarchical Forecast Reconciliation
Quick Install
We recommend first setting up a clean Python environment for your project with Python 3.8+ using your favorite tool (conda, venv, or virtualenv with or without virtualenvwrapper).
Once your environment is set up, you can install darts using pip:
pip install darts
For more details you can refer to our installation instructions.
Example Usage
Forecasting
Create a TimeSeries object from a Pandas DataFrame, and split it into train and validation series:
import pandas as pd
from darts import TimeSeries
# Read a pandas DataFrame
df = pd.read_csv("AirPassengers.csv", delimiter=",")
# Create a TimeSeries, specifying the time and value columns
series = TimeSeries.from_dataframe(df, "Month", "#Passengers")
# Set aside the last 36 months as a validation series
train, val = series[:-36], series[-36:]
Fit an exponential smoothing model, and make a (probabilistic) prediction over the validation series' duration:
from darts.models import ExponentialSmoothing
model = ExponentialSmoothing()
model.fit(train)
prediction = model.predict(len(val), num_samples=1000)
Plot the median, 5th and 95th percentiles:
import matplotlib.pyplot as plt
series.plot()
prediction.plot(label="forecast", low_quantile=0.05, high_quantile=0.95)
plt.legend()
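Backtesting works with the same objects. Below is a minimal sketch (reusing the model and series from above; the start point and forecast horizon are arbitrary choices for this example) that simulates historical 12-month forecasts and evaluates them with the MAPE metric:
from darts.metrics import mape
# Simulate 12-month-ahead historical forecasts over the last quarter of the
# series, retraining the model as the forecasting window moves forward,
# and report the mean MAPE across all of them.
backtest_mape = model.backtest(
    series,
    start=0.75,           # start forecasting after 75% of the series
    forecast_horizon=12,  # forecast 12 months at each step
    metric=mape,
)
print(f"Backtest MAPE: {backtest_mape:.2f}")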
Anomaly Detection
Load a multivariate series, trim it, keep 2 components, and split it into train and validation sets:
from darts.datasets import ETTh2Dataset
series = ETTh2Dataset().load()[:10000][["MUFL", "LULL"]]
train, val = series.split_before(0.6)
Build a k-means anomaly scorer, train it on the train set and use it on the validation set to get anomaly scores:
from darts.ad import KMeansScorer
scorer = KMeansScorer(k=2, window=5)
scorer.fit(train)
anom_score = scorer.score(val)
Build a binary anomaly detector and fit it on the train scores, then use it on the validation scores to obtain a binary anomaly classification:
from darts.ad import QuantileDetector
detector = QuantileDetector(high_quantile=0.99)
detector.fit(scorer.score(train))
binary_anom = detector.detect(anom_score)
Plot (shifting and scaling some of the series to make everything appear on the same figure):
import matplotlib.pyplot as plt
series.plot()
(anom_score / 2. - 100).plot(label="computed anomaly score", c="orangered", lw=3)
(binary_anom * 45 - 150).plot(label="detected binary anomaly", lw=4)
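Any Darts forecasting model can also be wrapped into a fully fledged anomaly model that scores the discrepancy between forecasts and actuals. A sketch of this, reusing the train and val series from above; the choice of RegressionModel and NormScorer is purely illustrative, and argument names may differ slightly across Darts versions:
from darts.ad import ForecastingAnomalyModel, NormScorer
from darts.models import RegressionModel
# Wrap a forecasting model and a scorer into one anomaly model.
anomaly_model = ForecastingAnomalyModel(
    model=RegressionModel(lags=24),  # any Darts forecasting model
    scorer=NormScorer(),             # scores the distance between forecast and actual
)
anomaly_model.fit(train, allow_model_training=True)  # fit the underlying forecaster
forecast_anom_score = anomaly_model.score(val)       # anomaly scores on the validation set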
Features
- Forecasting Models: A large collection of forecasting models, from statistical models (such as ARIMA) to deep learning models (such as N-BEATS). See the table of models below.
- Anomaly Detection: The darts.ad module contains a collection of anomaly scorers, detectors and aggregators, which can all be combined to detect anomalies in time series. It is easy to wrap any of Darts forecasting or filtering models to build a fully fledged anomaly detection model that compares predictions with actuals. The PyODScorer makes it trivial to use PyOD detectors on time series.
- Multivariate Support: TimeSeries can be multivariate - i.e., contain multiple time-varying dimensions instead of a single scalar value. Many models can consume and produce multivariate series.
- Multiple series training (global models): All machine learning based models (incl. all neural networks) support being trained on multiple (potentially multivariate) series. This can scale to large datasets too.
- Probabilistic Support: TimeSeries objects can (optionally) represent stochastic time series; this can for instance be used to get confidence intervals, and many models support different flavours of probabilistic forecasting (such as estimating parametric distributions or quantiles). Some anomaly detection scorers are also able to exploit these predictive distributions.
- Past and Future Covariates support: Many models in Darts support past-observed and/or future-known covariate (external data) time series as inputs for producing forecasts (see the sketch after this list).
- Static Covariates support: In addition to time-dependent data, TimeSeries can also contain static data for each dimension, which can be exploited by some models.
- Hierarchical Reconciliation: Darts offers transformers to perform reconciliation. These can make the forecasts add up in a way that respects the underlying hierarchy.
- Regression Models: It is possible to plug in any scikit-learn compatible model to obtain forecasts as functions of lagged values of the target series and covariates.
- Explainability: Darts has the ability to explain some forecasting models using Shap values.
- Data processing: Tools to easily apply (and revert) common transformations on time series data (scaling, filling missing values, differencing, Box-Cox, ...).
- Metrics: A variety of metrics for evaluating time series' goodness of fit, from R2-scores to Mean Absolute Scaled Error.
- Backtesting: Utilities for simulating historical forecasts, using moving time windows.
- PyTorch Lightning Support: All deep learning models are implemented using PyTorch Lightning, supporting among other things custom callbacks, GPU/TPU training and custom trainers.
- Filtering Models: Darts offers three filtering models: KalmanFilter, GaussianProcessFilter, and MovingAverageFilter, which allow filtering time series and, in some cases, obtaining probabilistic inferences of the underlying states/values.
- Datasets: The darts.datasets submodule contains some popular time series datasets for rapid and reproducible experimentation.
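As an illustration of the covariates and regression-model support above, here is a minimal sketch that forecasts the AirPassengers series from the example earlier with a future-known covariate. The month covariate and lag choices are assumptions for the example, not requirements:
from darts.models import LinearRegressionModel
from darts.utils.timeseries_generation import datetime_attribute_timeseries
# Encode the calendar month as a future-known covariate series
# spanning the same time index as the target series.
month_series = datetime_attribute_timeseries(series, attribute="month")
# Forecast from 12 lags of the target plus the concurrent month covariate.
model = LinearRegressionModel(lags=12, lags_future_covariates=[0])
model.fit(train, future_covariates=month_series)
forecast = model.predict(len(val), future_covariates=month_series)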
Forecasting Models
Here's a breakdown of the forecasting models currently implemented in Darts (🟩 = supported, 🟥 = not supported). We are constantly working on bringing more models and features.
Model | Sources | Target Series Support: Univariate / Multivariate | Covariates Support: Past-observed / Future-known / Static | Probabilistic Forecasting: Sampled / Distribution Parameters | Training & Forecasting on Multiple Series
---|---|---|---|---|---
Baseline Models (LocalForecastingModel) | | | | |
NaiveMean | | 🟩 🟥 | 🟥 🟥 🟥 | 🟥 🟥 | 🟥
NaiveSeasonal | | 🟩 🟥 | 🟥 🟥 🟥 | 🟥 🟥 | 🟥
NaiveDrift | | 🟩 🟥 | 🟥 🟥 🟥 | 🟥 🟥 | 🟥
NaiveMovingAverage | | 🟩 🟥 | 🟥 🟥 🟥 | 🟥 🟥 | 🟥
Statistical / Classic Models (LocalForecastingModel) | | | | |
ARIMA | | 🟩 🟥 | 🟥 🟩 🟥 | 🟩 🟥 | 🟥
VARIMA | | 🟥 🟩 | 🟥 🟩 🟥 | 🟥 🟥 | 🟥
AutoARIMA | | 🟩 🟥 | 🟥 🟩 🟥 | 🟥 🟥 | 🟥
StatsForecastAutoArima (faster AutoARIMA) | Nixtla's statsforecast | 🟩 🟥 | 🟥 🟩 🟥 | 🟩 🟥 | 🟥
ExponentialSmoothing | | 🟩 🟥 | 🟥 🟥 🟥 | 🟩 🟥 | 🟥
StatsforecastAutoETS | Nixtla's statsforecast | 🟩 🟥 | 🟥 🟩 🟥 | 🟩 🟥 | 🟥
StatsforecastAutoCES | Nixtla's statsforecast | 🟩 🟥 | 🟥 🟥 🟥 | 🟥 🟥 | 🟥
BATS and TBATS | TBATS paper | 🟩 🟥 | 🟥 🟥 🟥 | 🟩 🟥 | 🟥
Theta and FourTheta | Theta & 4 Theta | 🟩 🟥 | 🟥 🟥 🟥 | 🟥 🟥 | 🟥
StatsForecastAutoTheta | Nixtla's statsforecast | 🟩 🟥 | 🟥 🟥 🟥 | 🟩 🟥 | 🟥
Prophet (see install notes) | Prophet repo | 🟩 🟥 | 🟥 🟩 🟥 | 🟩 🟥 | 🟥
FFT (Fast Fourier Transform) | | 🟩 🟥 | 🟥 🟥 🟥 | 🟥 🟥 | 🟥
KalmanForecaster using the Kalman filter and N4SID for system identification | N4SID paper | 🟩 🟩 | 🟥 🟩 🟥 | 🟩 🟥 | 🟥
Croston method | | 🟩 🟥 | 🟥 🟥 🟥 | 🟥 🟥 | 🟥
Regression Models (GlobalForecastingModel) | | | | |
RegressionModel: generic wrapper around any sklearn regression model | | 🟩 🟩 | 🟩 🟩 🟩 | 🟥 🟥 | 🟩
LinearRegressionModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
RandomForest | | 🟩 🟩 | 🟩 🟩 🟩 | 🟥 🟥 | 🟩
LightGBMModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
XGBModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
CatBoostModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
PyTorch (Lightning)-based Models (GlobalForecastingModel) | | | | |
RNNModel (incl. LSTM and GRU); equivalent to DeepAR in its probabilistic version | DeepAR paper | 🟩 🟩 | 🟥 🟩 🟥 | 🟩 🟩 | 🟩
BlockRNNModel (incl. LSTM and GRU) | | 🟩 🟩 | 🟩 🟥 🟥 | 🟩 🟩 | 🟩
NBEATSModel | N-BEATS paper | 🟩 🟩 | 🟩 🟥 🟥 | 🟩 🟩 | 🟩
NHiTSModel | N-HiTS paper | 🟩 🟩 | 🟩 🟥 🟥 | 🟩 🟩 | 🟩
TCNModel | TCN paper, DeepTCN paper, blog post | 🟩 🟩 | 🟩 🟥 🟥 | 🟩 🟩 | 🟩
TransformerModel | | 🟩 🟩 | 🟩 🟥 🟥 | 🟩 🟩 | 🟩
TFTModel (Temporal Fusion Transformer) | TFT paper, PyTorch Forecasting | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
DLinearModel | DLinear paper | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
NLinearModel | NLinear paper | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
TiDEModel | TiDE paper | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
Ensemble Models (GlobalForecastingModel): Model support is dependent on ensembled forecasting models and the ensemble model itself | | | | |
NaiveEnsembleModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
RegressionEnsembleModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩
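All global models in the table above can be trained on several (potentially multivariate) series at once. A minimal sketch, where series_a and series_b are hypothetical TimeSeries and the hyperparameters are illustrative only:
from darts.models import NBEATSModel
# Train one global model jointly on multiple series.
model = NBEATSModel(input_chunk_length=24, output_chunk_length=12, n_epochs=5)
model.fit([series_a, series_b])
# Then forecast any one of them by passing it explicitly.
forecast_a = model.predict(n=12, series=series_a)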
Community & Contact
Anyone is welcome to join our Gitter room to ask questions, make proposals, discuss use cases, and more. If you spot a bug or have suggestions, GitHub issues are also welcome.
If what you want to tell us is not suitable for Gitter or GitHub, feel free to send us an email at darts@unit8.co for Darts-related matters or info@unit8.co for any other inquiries.
Contribute
The development is ongoing, and we welcome suggestions, pull requests and issues on GitHub. All contributors will be acknowledged on the change log page.
Before working on a contribution (a new feature or a fix), check our contribution guidelines.
Citation
If you are using Darts in your scientific work, we would appreciate citations to the following JMLR paper.
Darts: User-Friendly Modern Machine Learning for Time Series
Bibtex entry:
@article{JMLR:v23:21-1177,
author = {Julien Herzen and Francesco Lässig and Samuele Giuliano Piazzetta and Thomas Neuer and Léo Tafti and Guillaume Raille and Tomas Van Pottelbergh and Marek Pasieka and Andrzej Skrodzki and Nicolas Huguenin and Maxime Dumonal and Jan Kościsz and Dennis Bader and Frédérick Gusset and Mounir Benheddi and Camila Williamson and Michal Kosinski and Matej Petrik and Gaël Grosch},
title = {Darts: User-Friendly Modern Machine Learning for Time Series},
journal = {Journal of Machine Learning Research},
year = {2022},
volume = {23},
number = {124},
pages = {1-6},
url = {http://jmlr.org/papers/v23/21-1177.html}
}
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file u8darts-0.25.0.tar.gz.
File metadata
- Download URL: u8darts-0.25.0.tar.gz
- Upload date:
- Size: 641.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.11
File hashes
Algorithm | Hash digest
---|---
SHA256 | f96c64739f036f2764c2fad72bb8c401edf486c8e75ed98d55c56ad63b8c720c
MD5 | 8fd1b4667ba6c9cba9a3a45599a9ee95
BLAKE2b-256 | aac49439c4a4dbc5b17f6a81cb9ccd1949727713874ccaae57633c0fc9fa436a
File details
Details for the file u8darts-0.25.0-py3-none-any.whl.
File metadata
- Download URL: u8darts-0.25.0-py3-none-any.whl
- Upload date:
- Size: 760.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.11
File hashes
Algorithm | Hash digest
---|---
SHA256 | f3f7a8fb1910abefb33c3dab6c3e369de57b9e77b72585a597237c860f284954
MD5 | 7968e9877a31035da3a4610fba2b71fe
BLAKE2b-256 | 60edacc5d7365db95b4495c64d4f84fe3277dbb7ace23e34f1e61e14d08b7b40