
A Time Series Cross-Validation library that lets you build, deploy and update composite models easily. An order of magnitude speed-up, combined with flexibility and rigour.

Project description




FOLD

Fast Adaptive Time Series ML Engine
Explore the docs »


Adaptive Models

The Adaptive ML Engine that lets you build, deploy and update models easily. An order of magnitude speed-up, combined with flexibility and rigour.

Fold works with many third party libraries

Main Features

Fold's main features

Installation

  • Prerequisites: Python >= 3.7 and pip

  • Install from pypi:

    pip install fold-core
    

Quickstart

You can quickly train your chosen models and get predictions by running:

from sklearn.ensemble import RandomForestRegressor
from statsforecast.models import ARIMA
from fold import ExpandingWindowSplitter, train_evaluate
from fold.composites import Ensemble
from fold.transformations import OnlyPredictions
from fold.utils.dataset import get_preprocessed_dataset

X, y = get_preprocessed_dataset(
    "weather/historical_hourly_la", target_col="temperature", shorten=1000
)

pipeline = [
    Ensemble(
        [
            RandomForestRegressor(),
            ARIMA(order=(1, 1, 0)),
        ]
    ),
    OnlyPredictions(),
]
splitter = ExpandingWindowSplitter(initial_train_window=0.2, step=0.2)
scorecard, prediction, trained_pipelines = train_evaluate(pipeline, X, y, splitter)
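To make the splitter arguments concrete, here is a minimal sketch of expanding-window splitting, under the assumption that initial_train_window=0.2 and step=0.2 are fractions of the dataset length (the function below is an illustration, not fold's implementation):

```python
def expanding_window_folds(n_rows, initial_train_window, step):
    """Illustrative expanding-window split: each fold trains on
    [0, train_end) and tests on [train_end, test_end); the train
    window grows with every fold."""
    train_end = int(n_rows * initial_train_window)
    step_size = int(n_rows * step)
    folds = []
    while train_end < n_rows:
        test_end = min(train_end + step_size, n_rows)
        folds.append((train_end, test_end))
        train_end = test_end
    return folds

folds = expanding_window_folds(1000, 0.2, 0.2)
# With 1000 rows: train on [0, 200) and test on [200, 400), then train on
# [0, 400) and test on [400, 600), and so on; the train window expands.
```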

Thinking of using fold? We'd love to hear about your use case and to help; please book a free 30-minute call with us!

(If you install krisi by running pip install krisi, you get an extended report back rather than a single metric.)

Fold is different

  • Adaptive Models and Backtesting at lightning speed.
    → fold lets you simulate and evaluate your models as they would have performed in reality (and when deployed), with clever use of parallelization and design.

  • Create composite models (ensembles, hybrids, stacking pipelines) easily.
    → Underutilized, but the easiest and fastest way to increase the performance of your Time Series models.

  • Built with Distributed Computing in mind.
    → Deploy your research and development pipelines to a cluster with ray, and use modin to handle out-of-memory datasets (full support for modin is coming in April).

  • Bridging the gap between Online and Mini-Batch learning.
    → Mix and match xgboost with ARIMA in a single pipeline. Boost your models' accuracy by updating them on every timestamp, if desired.

  • Update your deployed models, easily, as new data flows in.
    → The real world is not static. Let your models adapt, without the need to re-train from scratch.
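The ensembling idea mentioned above can be sketched in a few lines of plain Python. This is simple prediction averaging, an illustration rather than fold's Ensemble internals, and the prediction values are made up:

```python
def ensemble_predict(models_preds):
    """Average the per-timestep predictions of several models:
    the simplest form of a composite model."""
    return [sum(step) / len(step) for step in zip(*models_preds)]

rf_preds = [20.0, 21.0, 19.0]     # hypothetical RandomForest predictions
arima_preds = [22.0, 23.0, 21.0]  # hypothetical ARIMA predictions
combined = ensemble_predict([rf_preds, arima_preds])
# → [21.0, 22.0, 20.0]
```

Averaging tends to cancel the uncorrelated errors of the individual models, which is why even this naive composite often beats its members.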

Examples, Walkthroughs and Blog Posts

  • ⚡️ Core Walkthrough (Walkthrough; Energy dataset): Notebook, Colab
  • 🚄 Speed Comparison of Fold to other libraries (Walkthrough; Weather dataset): Notebook, Colab
  • 📚 Example Collection (Example; Weather & Synthetic datasets): Collection Link
  • 🖋️ Why we ended up building an Adaptive ML engine for Time Series (Blog; Public Release): Blog post on Applied Exploration

Core Features

  • Supports both Regression and Classification tasks.
  • Online and Mini-batch learning.
  • Feature selection and other transformations on an expanding/rolling window basis.
  • Use any scikit-learn/tabular model natively!
  • Use any univariate or sequence models (wrappers provided in fold-wrappers).
  • Use any Deep Learning Time Series models (wrappers provided in fold-wrappers).
  • Super easy syntax!
  • Probabilistic forecasts (currently for Classification; full support coming in April).
  • Hyperparameter optimization / model selection (coming in early April!).
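As an illustration of the expanding/rolling-window transformations listed above, here is a leakage-free rolling-mean feature in plain Python: the feature at time t uses only observations strictly before t, so no future information leaks into training (a sketch, not fold's API):

```python
def lagged_rolling_mean(series, window):
    """Rolling-mean feature computed from past values only:
    the value at index t averages up to `window` observations
    strictly before t, avoiding look-ahead leakage."""
    out = []
    for t in range(len(series)):
        past = series[max(0, t - window):t]
        out.append(sum(past) / len(past) if past else None)
    return out

feats = lagged_rolling_mean([1.0, 2.0, 3.0, 4.0], window=2)
# → [None, 1.0, 1.5, 2.5]
```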

What is Adaptive Backtesting?


It's like classical backtesting / Time Series Cross-Validation, with one addition: inside a test window, and during deployment, fold provides a way for models to update their parameters or access the latest values. Learn more
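The idea can be sketched with a toy model that predicts a running mean and is updated after every observed value inside the test span (an illustration of the concept, not fold's internals):

```python
class RunningMean:
    """Toy adaptive model: predicts the mean of everything seen so far
    and can be updated one observation at a time."""
    def __init__(self):
        self.n, self.mu = 0, 0.0

    def update(self, value):
        self.n += 1
        self.mu += (value - self.mu) / self.n

    def predict(self):
        return self.mu

def adaptive_backtest(model, y, initial_train):
    """Fit on the initial window, then alternate predict/update inside
    the test span, so the model adapts as each new value arrives."""
    for value in y[:initial_train]:
        model.update(value)
    preds = []
    for value in y[initial_train:]:
        preds.append(model.predict())  # predict before seeing the value
        model.update(value)            # then adapt to it
    return preds

preds = adaptive_backtest(RunningMean(), [1.0, 2.0, 3.0, 6.0], initial_train=2)
# → [1.5, 2.0]
```

A classical backtest would freeze the model after the initial fit; the per-step update is what makes the evaluation "adaptive".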

Our Open-core Time Series Toolkit

  • Krisi
  • Fold
  • Fold/Models
  • Fold/Wrappers

If you want to try them out, we'd love to hear about your use case and to help; please book a free 30-minute call with us!

Explore our Commercial License options here

Contribution

Join our Discord Community for live discussion!

Submit an issue or reach out to us on info at dream-faster.ai for any inquiries.

Licence & Usage

We want to bring much-needed transparency, speed and rigour to the process of creating Time Series ML pipelines, while also building a sustainable business that can support the ecosystem in the long term. Fold's licence falls between source-available and a traditional commercial software licence. It requires a paid licence for any commercial use, after an initial 30-day trial period.

We also want to contribute to open research by giving free access to non-commercial, research use of fold.

Read more

Limitations

  • No support for intermittent time series; very limited support for missing values.
  • No hierarchical time series support.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fold_core-0.1.7.tar.gz (44.8 kB)

Uploaded Source

Built Distribution

fold_core-0.1.7-py3-none-any.whl (72.8 kB)

Uploaded Python 3

File details

Details for the file fold_core-0.1.7.tar.gz.

File metadata

  • Download URL: fold_core-0.1.7.tar.gz
  • Upload date:
  • Size: 44.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.9.16 Linux/5.15.0-1036-azure

File hashes

Hashes for fold_core-0.1.7.tar.gz

  • SHA256: c4898efedd6593f9b9459164977e78e7f0e51740d280b5b6ad3e4e87f4016b9e
  • MD5: c50be34fa465d2f227a400a54691ed12
  • BLAKE2b-256: be9c42058deb2bc16457e2e8662c50a7c9f7743ff1756c4a8c3c41ef0fc5d893

See more details on using hashes here.

File details

Details for the file fold_core-0.1.7-py3-none-any.whl.

File metadata

  • Download URL: fold_core-0.1.7-py3-none-any.whl
  • Upload date:
  • Size: 72.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.9.16 Linux/5.15.0-1036-azure

File hashes

Hashes for fold_core-0.1.7-py3-none-any.whl

  • SHA256: 45dd7e5c93db804808ee9ce55cfd774cd29a7101570969c499ef81745601ea6a
  • MD5: 30830974916a0d04b21f1b6d1c471bc6
  • BLAKE2b-256: 7a49df7fef84dfdea11c47959a14b6b4d89a1795b6688d5ca994c356c938e607

See more details on using hashes here.
