UR2CUTE: A Time-Series Forecasting Model for Intermittent Demand
Using Repetitively 2 CNNs for Unsteady Timeseries Estimation. UR2CUTE is a dual-stage, PyTorch-powered model dedicated to intermittent demand forecasting: a classifier estimates demand occurrence while a regressor predicts magnitude, letting the library focus on the sparse structure typical of slow-moving inventory.
Overview
Intermittent demand is dominated by long zero stretches punctuated by irregular spikes. Traditional statistical models struggle to capture both the timing and the size of those bursts. UR2CUTE tackles the problem with a hurdle-style architecture:
- A CNN classifier predicts the probability of non-zero demand for each step in the forecast horizon.
- A CNN regressor estimates the corresponding quantities.
- Final forecasts combine the two outputs through an adaptive threshold so the regressor only contributes when demand is likely.
The estimator follows the scikit-learn API, includes thorough input validation, and automatically selects CPU or GPU devices.
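The combination step above can be sketched in a few lines. This is an illustrative reconstruction of the hurdle logic, not UR2CUTE's actual code; `combine_hurdle` and its argument names are hypothetical.

```python
import numpy as np

def combine_hurdle(probs, quantities, threshold=0.5):
    """Keep the regressor's output only where the classifier deems
    demand likely; zero out the rest. Illustrative only, not part of
    the UR2CUTE API."""
    probs = np.asarray(probs, dtype=float)
    quantities = np.asarray(quantities, dtype=float)
    return np.where(probs >= threshold, quantities, 0.0)

# Example: a 4-step horizon with two likely-demand periods.
forecast = combine_hurdle([0.9, 0.2, 0.7, 0.1], [5.3, 4.0, 6.8, 2.1])
print(forecast)  # steps 2 and 4 are zeroed out
```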
Features
- Pure PyTorch implementation with GPU support when available.
- Direct multi-step forecasting: predicts the entire horizon in a single forward pass.
- Automatic lag generation plus optional external covariates.
- Customizable hyperparameters (epochs, batch size, independent learning rates, dropout).
- Auto-threshold mode that derives an occurrence cutoff from the training set.
- Early stopping with persistent checkpoints stored in a temporary directory.
- Reproducible results through explicit random seed management.
- Model persistence through `save_model` and `load_model`.
- Complete type hints and packaged type information (`py.typed`).
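The automatic lag generation listed above amounts to adding shifted copies of the target as extra columns. The helper below is a generic pandas sketch of that idea, not UR2CUTE's internal code; the function and column names are illustrative.

```python
import pandas as pd

def add_lag_features(df, target_col="target", n_steps_lag=3):
    """Add n_steps_lag shifted copies of the target column and drop
    the leading rows that have incomplete history. Illustrative only."""
    out = df.copy()
    for lag in range(1, n_steps_lag + 1):
        out[f"{target_col}_lag{lag}"] = out[target_col].shift(lag)
    return out.dropna().reset_index(drop=True)

demo = pd.DataFrame({"target": [0, 5, 0, 0, 12, 0]})
print(add_lag_features(demo))
```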
Dependencies
- Python 3.7 or newer
- PyTorch 1.7+
- NumPy
- pandas
- scikit-learn
Installation
From PyPI
```bash
pip install UR2CUTE
```
From Source
```bash
git clone https://github.com/FH-Prevail/UR2CUTE_torch.git
cd UR2CUTE_torch
pip install -e .

# Optional extras
pip install -e ".[dev]"
pip install -e ".[test]"
pip install -e ".[docs]"
```
Verify Installation
```python
from UR2CUTE import UR2CUTE

print(UR2CUTE.__module__)
```
Quick Start
```python
import pandas as pd
import torch

from UR2CUTE import UR2CUTE

data = pd.DataFrame(
    {
        "date": pd.date_range("2023-01-01", periods=50, freq="W"),
        "target": [0, 5, 0, 0, 12, 0, 0, 0, 7, 0] * 5,
        "promo": [0, 1, 0, 0, 1, 0, 0, 1, 0, 0] * 5,
        "price": [10.0, 9.5, 9.5, 9.5, 10.0, 10.0, 10.0, 9.8, 9.8, 9.8] * 5,
    }
)

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

model = UR2CUTE(
    n_steps_lag=3,
    forecast_horizon=4,
    external_features=["promo", "price"],
    threshold="auto",
)
model.fit(data, target_col="target")
print(model.predict(data))
```
Parameters
| Parameter | Description | Default |
|---|---|---|
| `n_steps_lag` | Number of lag features to generate. | 3 |
| `forecast_horizon` | Number of future periods predicted per call. | 8 |
| `external_features` | Optional list of column names used as exogenous inputs. | None |
| `epochs` | Training epochs for both CNN models. | 100 |
| `batch_size` | Training batch size. | 32 |
| `threshold` | Manual probability threshold, or "auto" to derive one from the training data. | 0.5 |
| `patience` | Early-stopping patience (epochs). | 10 |
| `random_seed` | Global random seed applied to NumPy, Python, and PyTorch. | 42 |
| `classification_lr` | Learning rate for the classifier. | 0.0021 |
| `regression_lr` | Learning rate for the regressor. | 0.0021 |
| `dropout_classification` | Dropout applied inside the classifier. | 0.4 |
| `dropout_regression` | Dropout applied inside the regressor. | 0.2 |
| `verbose` | Enables progress output and early-stopping logs. | True |
Usage Patterns
Auto Threshold
```python
model = UR2CUTE(threshold="auto")
model.fit(df, "target")
print(model.threshold_)
```
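One plausible way a data-driven cutoff could be derived is to scan candidate thresholds and keep the one that best separates zero from non-zero periods. The sketch below illustrates that idea with an F1-based scan; it is not UR2CUTE's documented internal rule, and `derive_threshold` is a hypothetical name.

```python
import numpy as np

def derive_threshold(probs, occurred, candidates=np.linspace(0.1, 0.9, 17)):
    """Pick the candidate cutoff that maximizes F1 on held-out
    occurrence labels. A hedged illustration of an adaptive threshold,
    not UR2CUTE's actual algorithm."""
    best_t, best_f1 = 0.5, -1.0
    for t in candidates:
        pred = probs >= t
        tp = np.sum(pred & occurred)
        fp = np.sum(pred & ~occurred)
        fn = np.sum(~pred & occurred)
        f1 = 2 * tp / max(2 * tp + fp + fn, 1)
        if f1 > best_f1:
            best_t, best_f1 = t, f1
    return best_t

probs = np.array([0.92, 0.81, 0.33, 0.18, 0.74, 0.07])
occurred = np.array([True, True, False, False, True, False])
print(derive_threshold(probs, occurred))  # ~0.35 for this toy data
```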
External Features
```python
covariates = ["promotion", "price", "weekday"]
model = UR2CUTE(external_features=covariates)
model.fit(df, "target")
```
Silent Training
```python
model = UR2CUTE(verbose=False)
model.fit(df, "target")
```
Model Persistence
```python
trained = UR2CUTE().fit(train_df, "target")
trained.save_model("production_model.pkl")

loaded = UR2CUTE.load_model("production_model.pkl")
preds = loaded.predict(new_df)
```
How It Works
- Preprocessing – validates the input frame, generates lag features, creates multi-step samples, and splits chronologically into train and validation partitions.
- Scaling – fits separate MinMax scalers on the training set and applies them to validation and inference data, preventing validation leakage.
- Classification Stage – trains a CNN with sigmoid output and BCE loss to estimate the probability of demand for each future horizon step.
- Regression Stage – trains a CNN regressor with MSE loss on samples that exhibit demand; if no such sequences exist, the model safely falls back to the full dataset.
- Inference – transforms the latest observed sequence, runs both networks, rescales quantities, and zeros out forecasts whose probability falls below the stored threshold.
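The leakage-free scaling in the second step can be sketched generically with scikit-learn. This mirrors the idea described above (chronological split, statistics fitted on the training partition only); it is not UR2CUTE's internal code.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

series = np.arange(10, dtype=float).reshape(-1, 1)

split = int(len(series) * 0.8)          # chronological split, no shuffling
train, val = series[:split], series[split:]

scaler = MinMaxScaler().fit(train)      # statistics from the train set only
train_scaled = scaler.transform(train)
val_scaled = scaler.transform(val)      # may fall outside [0, 1]: no leakage

print(val_scaled.ravel())
```

Values above 1 in the scaled validation set confirm the scaler never saw the validation data.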
Performance
Internal benchmarks show UR2CUTE outperforming Croston, AutoARIMA, Prophet, gradient boosted trees, and random forests on sparse demand series, especially in MAE% and RMSE%. Improvements stem from the dedicated occurrence model, lagged covariates, and the ability to learn temporal filters tuned to each dataset.
Citation
```bibtex
@article{mirshahi2024intermittent,
  title={Intermittent Time Series Demand Forecasting Using Dual Convolutional Neural Networks},
  author={Mirshahi, Sina and Brandtner, Patrick and Kominkova Oplatkova, Zuzana},
  journal={MENDEL -- Soft Computing Journal},
  volume={30},
  number={1},
  year={2024},
  publisher={MENDEL Journal}
}
```
License
UR2CUTE is released under the MIT License. See LICENSE for the full text.
Contributors
- Sina Mirshahi
- Patrick Brandtner
- Zuzana Kominkova Oplatkova
- Taha Falatouri
- Mehran Naseri
- Farzaneh Darbanian
Acknowledgments
This work was carried out at:
- Department of Informatics and Artificial Intelligence, Tomas Bata University
- Department for Logistics, University of Applied Sciences Upper Austria, Steyr
- Josef Ressel-Centre for Predictive Value Network Intelligence, Steyr