Hierarchical Forecast 👑

Probabilistic hierarchical forecasting with statistical and econometric methods

HierarchicalForecast offers a collection of reconciliation methods, including BottomUp, TopDown, MiddleOut, MinTrace and ERM, as well as probabilistic coherent predictions with Normality, Bootstrap, and PERMBU.

📚 Intro

A vast number of time series datasets are organized into structures with different levels or hierarchies of aggregation. Examples include categories, brands, or geographical groupings. Coherent forecasts across levels (e.g., regional forecasts that add up to the national total) are necessary for consistent decision-making and planning. HierarchicalForecast offers different reconciliation methods that render coherent forecasts across hierarchies. Until recently, these methods were mainly available in the R ecosystem. This Python-based framework aims to bridge the gap between statistical modeling and Machine Learning in the time series field.

🎊 Features

  • Classic reconciliation methods:
    • BottomUp: Simple addition of the bottom-level forecasts up to the upper levels (see the sketch after this list).
    • TopDown: Distributes the top level's forecasts through the hierarchy.
  • Alternative reconciliation methods:
    • MiddleOut: Anchors the base predictions at a middle level. The levels above the anchor are reconciled with the bottom-up approach, while the levels below use a top-down approach.
    • MinTrace: Minimizes the total forecast variance over the space of coherent forecasts with the Minimum Trace (MinT) reconciliation.
    • ERM: Optimizes the reconciliation matrix by minimizing an L1-regularized objective.
  • Probabilistic coherent methods:
    • Normality: Uses the MinTrace closed-form variance-covariance matrix under a normality assumption.
    • Bootstrap: Generates distribution of hierarchically reconciled predictions using Gamakumara's bootstrap approach.
    • PERMBU: Reconciles independent sample predictions by reinjecting multivariate dependence with estimated rank permutation copulas, and performing a Bottom-Up aggregation.
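
To make the classic methods concrete, here is a minimal, illustrative sketch of bottom-up and MinTrace reconciliation with a summing matrix. The tiny two-series hierarchy, the hand-written base forecasts, and the identity (OLS) weight choice are assumptions made for illustration only; they are not part of the library's API.

import numpy as np

# A tiny hierarchy: total = bottom_1 + bottom_2
# Rows of S index [total, bottom_1, bottom_2]; columns index the bottom series
S = np.array([[1., 1.],
              [1., 0.],
              [0., 1.]])

# Incoherent base forecasts for [total, bottom_1, bottom_2]
y_hat = np.array([110., 40., 60.])     # note: 40 + 60 != 110

# BottomUp: keep the bottom forecasts and add them up through S
y_bu = S @ y_hat[1:]                   # -> [100., 40., 60.]

# MinTrace: y_tilde = S (S' W^{-1} S)^{-1} S' W^{-1} y_hat
# Here W = I, which corresponds to the OLS variant of MinT
W_inv = np.eye(3)
P = np.linalg.inv(S.T @ W_inv @ S) @ S.T @ W_inv
y_mint = S @ P @ y_hat                 # coherent: the top equals the sum of the bottoms

All of the methods listed above produce forecasts that are coherent with S; most can be written in this S P ŷ form and differ mainly in how the projection matrix P (and the error covariance it relies on) is chosen.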

Missing something? Please open an issue here or write to us on Slack.

📖 Why?

Short: We want to contribute to the ML field by providing reliable baselines and benchmarks for hierarchical forecasting tasks in industry and academia. Here's the complete paper.

Verbose: HierarchicalForecast integrates publicly available processed datasets, evaluation metrics, and a curated set of standard statistical baselines. In this library we provide usage examples and references to extensive experiments where we showcase the baselines' use and evaluate the accuracy of their predictions. With this work, we hope to contribute to Machine Learning forecasting by bridging the gap to statistical and econometric modeling, as well as providing tools for the development of novel hierarchical forecasting algorithms rooted in a thorough comparison of these well-established models. We intend to continue maintaining and growing the repository, promoting collaboration across the forecasting community.

💻 Installation

You can install HierarchicalForecast from the Python Package Index (PyPI) with pip:

pip install hierarchicalforecast

You can also install HierarchicalForecast from conda with:

conda install -c conda-forge hierarchicalforecast

🧬 How to use

The following example needs statsforecast and datasetsforecast as additional packages. If they are not installed, install them via your preferred method, e.g. pip install statsforecast datasetsforecast. The datasetsforecast library allows us to download hierarchical datasets, and we will use statsforecast to compute the base forecasts to be reconciled.

You can open a complete example in Colab.

Minimal Example:

# !pip install -U numba statsforecast datasetsforecast
import numpy as np
import pandas as pd

# obtain hierarchical dataset
from datasetsforecast.hierarchical import HierarchicalData

# compute base forecasts (not yet coherent)
from statsforecast.core import StatsForecast
from statsforecast.models import AutoARIMA, Naive

# obtain hierarchical reconciliation methods and evaluation
from hierarchicalforecast.core import HierarchicalReconciliation
from hierarchicalforecast.evaluation import HierarchicalEvaluation
from hierarchicalforecast.methods import BottomUp, TopDown, MiddleOut


# Load TourismSmall dataset
Y_df, S, tags = HierarchicalData.load('./data', 'TourismSmall')
Y_df['ds'] = pd.to_datetime(Y_df['ds'])

# split train/test sets
Y_test_df  = Y_df.groupby('unique_id').tail(4)
Y_train_df = Y_df.drop(Y_test_df.index)

# Compute base predictions (auto-ARIMA and a Naive benchmark)
fcst = StatsForecast(df=Y_train_df,
                     models=[AutoARIMA(season_length=4), Naive()],
                     freq='Q', n_jobs=-1)
Y_hat_df = fcst.forecast(h=4)

# Reconcile the base predictions
reconcilers = [
    BottomUp(),
    TopDown(method='forecast_proportions'),
    MiddleOut(middle_level='Country/Purpose/State',
              top_down_method='forecast_proportions')
]
hrec = HierarchicalReconciliation(reconcilers=reconcilers)
Y_rec_df = hrec.reconcile(Y_hat_df=Y_hat_df, Y_df=Y_train_df,
                          S=S, tags=tags)

Evaluation

The evaluation below assumes you have a test DataFrame (here, Y_test_df).

def mse(y, y_hat):
    return np.mean((y-y_hat)**2)

evaluator = HierarchicalEvaluation(evaluators=[mse])
evaluator.evaluate(Y_hat_df=Y_rec_df, Y_test_df=Y_test_df.set_index('unique_id'),
                   tags=tags, benchmark='Naive')
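
Probabilistic forecasts

The probabilistic methods listed in the Features section (Normality, Bootstrap, PERMBU) are requested at reconciliation time. The snippet below is a hedged sketch: the level and intervals_method arguments, and the need to pass the base model's insample fitted values, reflect our reading of recent versions of the API and may differ in your installed release; check the documentation for the exact signature.

# Hedged sketch: probabilistic coherent forecasts.
# Bootstrap/Normality/PERMBU need insample residuals, so the base forecaster
# is asked to keep its fitted values.
Y_hat_df    = fcst.forecast(h=4, fitted=True, level=[80, 90])
Y_fitted_df = fcst.forecast_fitted_values()

hrec = HierarchicalReconciliation(reconcilers=[BottomUp()])
Y_rec_df = hrec.reconcile(Y_hat_df=Y_hat_df, Y_df=Y_fitted_df,
                          S=S, tags=tags,
                          level=[80, 90],                # prediction-interval levels
                          intervals_method='bootstrap')  # or 'normality', 'permbu'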

📖 Documentation (WIP)

Here is a link to the documentation.

📃 License

This project is licensed under the MIT License - see the LICENSE file for details.

🏟 HTS projects

In the R ecosystem, we recommend checking out fable and the now-retired hts. In Python, we want to acknowledge the following libraries: hiere2e, sktime, darts, pyhts, scikit-hts.

📚 References and Acknowledgements

This work is highly influenced by the fantastic work of previous contributors and other scholars who previously proposed the reconciliation methods presented here. We want to highlight the work of Rob Hyndman, George Athanasopoulos, Shanika L. Wickramasuriya, Souhaib Ben Taieb, and Bonsoo Koo. For a full reference link, please visit the Reference section of this paper. We encourage users to explore this literature review.

🙏 How to cite

If you enjoy or benefit from using these Python implementations, a citation to this hierarchical forecasting reference paper will be greatly appreciated.

@article{olivares2022hierarchicalforecast,
    author    = {Kin G. Olivares and
                 Federico Garza and 
                 David Luo and 
                 Cristian Challú and
                 Max Mergenthaler and
                 Souhaib Ben Taieb and
                 Shanika L. Wickramasuriya and
                 Artur Dubrawski},
    title     = {{HierarchicalForecast}: A Reference Framework for Hierarchical Forecasting in Python},
    journal   = {Work in progress paper, submitted to Journal of Machine Learning Research.},
    volume    = {abs/2207.03517},
    year      = {2022},
    url       = {https://arxiv.org/abs/2207.03517},
    archivePrefix = {arXiv}
}
