# randomstatsmodels

Tools for benchmarking, metrics, and models: lightweight utilities for forecasting and statistical modeling, with simple Auto* model wrappers that tune hyperparameters for you.
## Installation

```bash
pip install randomstatsmodels
```

Requires Python 3.9+ and NumPy.
## Quick Start

```python
import numpy as np
from randomstatsmodels import AutoNEO

# Toy data: sine wave + noise
rng = np.random.default_rng(42)
t = np.arange(200)
y = np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size)
h = 12  # forecast horizon

model = AutoNEO().fit(y)
yhat = model.predict(h)
print("Forecast:", yhat[:5])
```
## Models

Each Auto* class:

- accepts a parameter grid (or uses sensible defaults),
- fits and evaluates candidates using a chosen metric,
- exposes a unified API: `.fit(y[, X])` and `.predict(h)`.
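In other words, each wrapper runs a small grid search against a holdout split and keeps the best candidate. A minimal sketch of that pattern (the `mean_forecast` candidate model and the split details here are illustrative assumptions, not the library's code):

```python
import numpy as np

def mae(actual, forecast):
    """Mean absolute error between actuals and forecasts."""
    return float(np.mean(np.abs(np.asarray(actual) - np.asarray(forecast))))

def mean_forecast(train, h, window):
    """Toy candidate model: forecast the mean of the last `window` points."""
    return np.full(h, np.mean(train[-window:]))

def grid_search(y, h, windows):
    """Hold out the last h points, score each candidate, keep the best."""
    train, valid = y[:-h], y[-h:]
    best = min(
        ((w, mae(valid, mean_forecast(train, h, w))) for w in windows),
        key=lambda pair: pair[1],
    )
    return {"window": best[0]}, best[1]

y_demo = np.sin(2 * np.pi * np.arange(120) / 24)
params, score = grid_search(y_demo, h=12, windows=[6, 12, 24])
print(params, round(score, 3))
```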
### AutoNEO

```python
from randomstatsmodels import AutoNEO

neo = AutoNEO(
    param_grid={"n_components": [8, 16, 32]},
    metric="mae",
)
neo.fit(y)
print("Best params:", neo.best_params_)
print("Prediction:", neo.predict(h))
```
### AutoFourier

```python
from randomstatsmodels import AutoFourier

fourier = AutoFourier(
    param_grid={"season_length": [12, 24], "n_terms": [3, 5]},
    metric="smape",
)
fourier.fit(y)
print("Prediction:", fourier.predict(h))
```
### AutoKNN

```python
from randomstatsmodels import AutoKNN

knn = AutoKNN(
    param_grid={"k": [3, 5, 7], "window": [12, 24]},
    metric="rmse",
)
knn.fit(y)
print("Prediction:", knn.predict(h))
```
### AutoPolymath

```python
from randomstatsmodels import AutoPolymath

poly = AutoPolymath(
    param_grid={"degree": [2, 3], "ridge": [0.0, 0.1]},
    metric="mae",
)
poly.fit(y)
print("Prediction:", poly.predict(h))
```
### AutoThetaAR

```python
from randomstatsmodels import AutoThetaAR

theta = AutoThetaAR(
    param_grid={"theta": [0.5, 1.0, 2.0]},
    metric="mape",
)
theta.fit(y)
print("Prediction:", theta.predict(h))
```
### AutoHybridForecaster

```python
from randomstatsmodels import AutoHybridForecaster

hybrid = AutoHybridForecaster(
    candidate_fourier=(0, 3, 6),
    candidate_trend=(0, 1),
    candidate_ar=(0, 3, 5),
    candidate_hidden=(8, 16, 32),
)
hybrid.fit(y)
print("Best config:", hybrid.best_config)
print("Prediction:", hybrid.predict(h))
```
### AutoMELD

```python
from randomstatsmodels import AutoMELD

meld = AutoMELD(
    lags_grid=(8, 12),
    scales_grid=((1, 3, 7), (1, 2, 4, 8)),
    rff_features_grid=(64, 128),
)
meld.fit(y)
print("Best config:", meld.best_["config"])
print("Prediction:", meld.predict(h))
```
### AutoPALF

```python
from randomstatsmodels import AutoPALF

palf = AutoPALF(
    p_candidates=(4, 8, 12),
    penalties=("huber", "l2"),
)
palf.fit(y)
print("Validation score:", palf.best_["val_score"])
print("Prediction:", palf.predict(h))
```
### AutoNaive

Essential baseline forecasters for proper model evaluation.

```python
from randomstatsmodels import AutoNaive

naive = AutoNaive(
    method_options=("last", "seasonal", "drift", "mean"),
    seasonal_periods=(1, 7, 12, 24),
)
naive.fit(y)
print("Best config:", naive.best_["config"])
print("Prediction:", naive.predict(h))
```
Methods:

- `"last"`: repeat the last observed value
- `"seasonal"`: repeat values from one seasonal period ago
- `"drift"`: linear extrapolation from first to last value
- `"mean"`: rolling or global mean
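For reference, the four strategies reduce to a few lines of NumPy (a from-scratch sketch following the descriptions above, not the package's implementation):

```python
import numpy as np

def naive_forecast(y, h, method="last", m=12):
    """Baseline forecasts; `m` is the seasonal period for "seasonal"."""
    y = np.asarray(y, dtype=float)
    if method == "last":        # repeat the final value
        return np.full(h, y[-1])
    if method == "seasonal":    # tile the last observed season
        return np.resize(y[-m:], h)
    if method == "drift":       # extend the first-to-last slope
        slope = (y[-1] - y[0]) / (len(y) - 1)
        return y[-1] + slope * np.arange(1, h + 1)
    if method == "mean":        # global mean
        return np.full(h, y.mean())
    raise ValueError(f"unknown method: {method}")

line = np.arange(10.0)  # a pure trend: drift extrapolates it exactly
print(naive_forecast(line, 3, "drift"))  # → [10. 11. 12.]
```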
### AutoHoltWinters

Classic Holt-Winters exponential smoothing with level, trend, and seasonal components.

```python
from randomstatsmodels import AutoHoltWinters

hw = AutoHoltWinters(
    seasonal_periods=(12, 24),
    trend_options=("add", "none", "damped"),
    seasonal_options=("add", "none"),
)
hw.fit(y)
print("Best config:", hw.best_["config"])
print("Prediction:", hw.predict(h))
```
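The additive recursions behind Holt-Winters can be sketched from scratch as follows (smoothing constants and initialization here are illustrative; the wrapper tunes its own):

```python
import numpy as np

def holt_winters_additive(y, m, alpha=0.3, beta=0.1, gamma=0.2, h=12):
    """Additive Holt-Winters: level, trend, and m seasonal states."""
    y = np.asarray(y, dtype=float)
    level = y[:m].mean()                              # initial level
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m    # initial trend
    season = (y[:m] - level).tolist()                 # initial seasonals
    for t in range(m, len(y)):
        s = season[t % m]
        new_level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - new_level) + (1 - gamma) * s
        level = new_level
    n = len(y)
    return np.array([level + (i + 1) * trend + season[(n + i) % m]
                     for i in range(h)])

t = np.arange(96)
series = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12)
fc = holt_winters_additive(series, m=12, h=12)
print(fc[:3])
```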
### AutoSSA

Singular Spectrum Analysis: decomposes the time series using SVD to discover adaptive oscillatory modes.

```python
from randomstatsmodels import AutoSSA

ssa = AutoSSA(
    window_fracs=(0.25, 0.33, 0.5),
    n_components_grid=(None, 2, 4, 8),
)
ssa.fit(y)
print("Best config:", ssa.best_["config"])
print("Prediction:", ssa.predict(h))
```
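Basic SSA itself is compact: embed the series in a Hankel trajectory matrix, take an SVD, keep the leading components, and average the anti-diagonals back into a series. A from-scratch sketch of that reconstruction step (not the wrapper's internals):

```python
import numpy as np

def ssa_reconstruct(y, window, n_components):
    """Reconstruct y from the leading SVD components of its trajectory matrix."""
    y = np.asarray(y, dtype=float)
    n, k = len(y), len(y) - window + 1
    X = np.column_stack([y[i:i + window] for i in range(k)])  # Hankel matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    out, counts = np.zeros(n), np.zeros(n)
    for j in range(k):                 # anti-diagonal averaging
        out[j:j + window] += Xr[:, j]
        counts[j:j + window] += 1
    return out / counts

t = np.arange(100)
clean = np.sin(2 * np.pi * t / 20)
noisy = clean + 0.1 * np.random.default_rng(0).normal(size=t.size)
denoised = ssa_reconstruct(noisy, window=30, n_components=2)
```

A pure sinusoid occupies only two singular components, so keeping `n_components=2` strips most of the noise.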
### AutoLocalLinear

Weighted local regression with exponential decay for older observations.

```python
from randomstatsmodels import AutoLocalLinear

ll = AutoLocalLinear(
    decay_grid=(0.9, 0.95, 0.98, 1.0),
    degree_grid=(1, 2),
)
ll.fit(y)
print("Best config:", ll.best_["config"])
print("Prediction:", ll.predict(h))
```
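The idea reduces to weighted least squares on the time index, with weights decaying geometrically into the past. A sketch (the package may differ in details such as the weight parameterization):

```python
import numpy as np

def local_linear_forecast(y, h, decay=0.95, degree=1):
    """Fit a polynomial in t, weighting point t by decay**(age)."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y))
    w = decay ** (len(y) - 1 - t)  # newest observation gets weight 1
    # np.polyfit applies w to residuals, so pass sqrt(w) to weight squared errors
    coeffs = np.polyfit(t, y, deg=degree, w=np.sqrt(w))
    future = np.arange(len(y), len(y) + h)
    return np.polyval(coeffs, future)

line = 2.0 * np.arange(20) + 1.0
print(local_linear_forecast(line, h=3))  # a pure line extrapolates exactly
```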
### AutoEnsemble

Combines multiple base forecasters with learned weights using validation performance.

```python
from randomstatsmodels import AutoEnsemble

ensemble = AutoEnsemble(
    weighting_options=("uniform", "validation", "optimal"),
)
ensemble.fit(y)
print("Best config:", ensemble.best_["config"])
print("Base model scores:", ensemble.base_scores_)
print("Prediction:", ensemble.predict(h))
```
Weighting methods:

- `"uniform"`: equal weights for all models
- `"validation"`: weights inversely proportional to validation error
- `"optimal"`: solve for weights that minimize validation error
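The `"validation"` scheme, for example, can be sketched as follows (`validation_weights` and `combine` are hypothetical helper names for illustration, not the package's API):

```python
import numpy as np

def validation_weights(errors):
    """Weights inversely proportional to validation error, normalized to sum 1."""
    inv = 1.0 / (np.asarray(errors, dtype=float) + 1e-12)  # epsilon guards zero error
    return inv / inv.sum()

def combine(forecasts, weights):
    """Weighted average of stacked per-model forecasts (models x horizon)."""
    return np.average(np.asarray(forecasts), axis=0, weights=weights)

w = validation_weights([2.0, 1.0, 4.0])
print(w)  # the model with error 1.0 gets the largest weight
```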
### AutoRIFT (Novel: Recursive Information Flow Tensor)

A forecasting model based on an original "Predictive Information Field Dynamics" theory. RIFT introduces a different paradigm: instead of modeling values directly, it models how predictive information flows and transforms between temporal channels (level, trend, curvature, oscillations) as the forecast horizon increases.

```python
from randomstatsmodels import AutoRIFT

rift = AutoRIFT(
    n_frequencies_grid=(2, 4, 6),
    embedding_dim_grid=(2, 3),
    regularization_grid=(0.001, 0.01),
)
rift.fit(y)
print("Best config:", rift.best_["config"])
print("Prediction:", rift.predict(h))

# Analyze which channels hold predictive information
info = rift.get_information_analysis(horizon=5)
print("Information by channel:", info)
```
Theoretical Innovation:
- Information Channels: Decomposes predictive power into orthogonal channels (level, trend, curvature, spectral components)
- Information Flow Matrix: Learns how information transfers between channels as horizon increases
- Fisher Information Estimation: Uses local variance reduction to estimate channel informativeness
- Adaptive Reconstruction: Combines channel extrapolations weighted by propagated information state
## Benchmarks

All models benchmarked on two classic time series datasets with 12-step-ahead forecasting.

### Airline Passengers Dataset

Monthly airline passenger numbers (1949–1960). Classic Box-Jenkins dataset with trend and seasonality.
| Model | MAE | RMSE |
|---|---|---|
| AutoLocalLinear | 13.43 | 17.30 |
| AutoPolymath | 14.39 | 17.44 |
| AutoNEO | 15.85 | 18.89 |
| AutoMELD | 24.98 | 29.53 |
| AutoSSA | 36.20 | 43.06 |
| AutoHoltWinters | 45.02 | 60.23 |
| AutoNaive | 47.83 | 50.71 |
| AutoFourier | 58.66 | 78.82 |
| AutoPALF | 60.07 | 83.03 |
| AutoKNN | 60.32 | 65.23 |
| AutoEnsemble | 61.47 | 85.20 |
| AutoThetaAR | 66.77 | 93.18 |
| AutoRIFT | 130.64 | 155.99 |
### Sunspots Dataset

Monthly sunspot numbers. Classic cyclical dataset without strong trend.
| Model | MAE | RMSE |
|---|---|---|
| AutoPALF | 5.65 | 7.05 |
| AutoRIFT | 5.73 | 8.05 |
| AutoPolymath | 5.91 | 7.10 |
| AutoNEO | 6.95 | 8.23 |
| AutoThetaAR | 7.40 | 8.51 |
| AutoNaive | 9.74 | 10.86 |
| AutoMELD | 12.15 | 13.96 |
| AutoEnsemble | 12.20 | 13.58 |
| AutoSSA | 13.85 | 16.08 |
| AutoKNN | 14.55 | 17.38 |
| AutoFourier | 18.37 | 19.73 |
| AutoHoltWinters | 24.32 | 25.84 |
| AutoLocalLinear | 30.30 | 32.70 |
Key observations:

- AutoLocalLinear excels on trending data (Airline Passengers)
- AutoRIFT performs strongly on cyclical/stationary data (2nd place on Sunspots)
- AutoPALF and AutoPolymath show consistent performance across both datasets
- Model performance varies significantly with data characteristics; no single model dominates
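The protocol behind these tables is a plain holdout comparison: reserve the last 12 points, forecast them, and score with MAE and RMSE. A self-contained sketch using a stand-in model with the same `.fit`/`.predict` shape (the `LastValue` class is illustrative, not part of the package):

```python
import numpy as np

def mae(actual, forecast):
    return float(np.mean(np.abs(actual - forecast)))

def rmse(actual, forecast):
    return float(np.sqrt(np.mean((actual - forecast) ** 2)))

class LastValue:
    """Stand-in forecaster mirroring the Auto* .fit/.predict interface."""
    def fit(self, y):
        self.last_ = float(y[-1])
        return self
    def predict(self, h):
        return np.full(h, self.last_)

def benchmark(y, models, h=12):
    """Score each model's h-step forecast against the held-out tail of y."""
    train, test = y[:-h], y[-h:]
    scores = {}
    for name, model in models.items():
        f = np.asarray(model.fit(train).predict(h))
        scores[name] = {"MAE": mae(test, f), "RMSE": rmse(test, f)}
    return scores

t = np.arange(144)
series = 10 + 0.2 * t + np.sin(2 * np.pi * t / 12)
scores = benchmark(series, {"last": LastValue()})
print(scores)
```

Note that RMSE is never smaller than MAE, which is a quick sanity check on any results table.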
## Metrics

Available out of the box:

```python
from randomstatsmodels.metrics import mae, mse, rmse, mape, smape
```
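These names usually denote the standard definitions below, reimplemented here from scratch for reference (the package's exact conventions, e.g. whether sMAPE is scaled to percent, are not guaranteed to match):

```python
import numpy as np

def mae(actual, pred):
    return float(np.mean(np.abs(actual - pred)))

def mse(actual, pred):
    return float(np.mean((actual - pred) ** 2))

def rmse(actual, pred):
    return float(np.sqrt(mse(actual, pred)))

def mape(actual, pred):
    return float(np.mean(np.abs((actual - pred) / actual)) * 100)

def smape(actual, pred):
    return float(np.mean(2 * np.abs(pred - actual)
                         / (np.abs(actual) + np.abs(pred))) * 100)

act = np.array([100.0, 200.0])
pred = np.array([110.0, 190.0])
print(mae(act, pred), rmse(act, pred))  # → 10.0 10.0
```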
## License

MIT © 2025 Jacob Wright
## Project details
### Source distribution: randomstatsmodels-1.4.0.tar.gz

- Size: 63.1 kB
- Uploaded via: twine/6.2.0 on CPython/3.11.1
- Uploaded using Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7ce44334bd7c303896a5e8f517fb4b06dd4ad269806aa3857a60ac443487ec3c` |
| MD5 | `f77f13c8e68a3d4a4aba8a0d290d76ae` |
| BLAKE2b-256 | `89ade329ce3df636722cdead4efe843725c02b361718e4a222e7a9ef06a9d046` |
### Built distribution: randomstatsmodels-1.4.0-py3-none-any.whl

- Size: 76.7 kB
- Uploaded via: twine/6.2.0 on CPython/3.11.1
- Uploaded using Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `fb4c14dd40752f9d02008d70da4be6c39cab3f473ae9bda32c060b48e8847df2` |
| MD5 | `5c68f13e6f3809bcbfd75adc9450a75a` |
| BLAKE2b-256 | `7d040edef1827c5eabbfe0b37ed864c0688877df22fc9b4dffd58afdd63898e8` |