
randomstatsmodels

Tools for benchmarking, metrics, and models.

Check out the Medium story here: Medium Story

Lightweight utilities for benchmarking, forecasting, and statistical modeling — with simple Auto* model wrappers that tune hyperparameters for you.

Installation

pip install randomstatsmodels

Requires: Python 3.9+ and NumPy.


Quick Start

from randomstatsmodels import AutoNEO, AutoFourier, AutoKNN, AutoPolymath, AutoThetaAR
import numpy as np

# Toy data: sine wave + noise
rng = np.random.default_rng(42)
t = np.arange(200)
y = np.sin(2*np.pi*t/24) + 0.1*rng.normal(size=t.size)

h = 12  # forecast horizon

model = AutoNEO().fit(y)
yhat = model.predict(h)
print("Forecast:", yhat[:5])

Models

Each Auto* class:

  • accepts a parameter grid (or uses sensible defaults),
  • fits/evaluates candidates using a chosen metric,
  • exposes a unified API: .fit(y[, X]) and .predict(h).
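As a rough illustration of that search pattern (not the library's internals), a holdout grid search over a toy seasonal-naive forecaster might look like this; `seasonal_naive`, `mae`, and `auto_fit` below are hypothetical helpers, not part of randomstatsmodels:

```python
import numpy as np

def seasonal_naive(y, h, season_length):
    """Repeat the last full season to cover the horizon h."""
    season = y[-season_length:]
    reps = int(np.ceil(h / season_length))
    return np.tile(season, reps)[:h]

def mae(y_true, y_pred):
    return float(np.mean(np.abs(y_true - y_pred)))

def auto_fit(y, h, param_grid=(6, 12, 24)):
    """Score each candidate on a holdout tail and keep the best."""
    train, valid = y[:-h], y[-h:]
    scores = {p: mae(valid, seasonal_naive(train, h, p)) for p in param_grid}
    best = min(scores, key=scores.get)
    return best, scores

t = np.arange(200)
y = np.sin(2 * np.pi * t / 24)  # true period is 24
best, scores = auto_fit(y, h=12)
print("Best season_length:", best)  # 24
```

The Auto* wrappers package this fit/evaluate/select loop behind `.fit()` so you only see the winning model.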

AutoNEO

from randomstatsmodels import AutoNEO

neo = AutoNEO(
    param_grid={"n_components": [8, 16, 32]},
    metric="mae",
)
neo.fit(y)
print("Best params:", neo.best_params_)
print("Prediction:", neo.predict(h))

AutoFourier

from randomstatsmodels import AutoFourier

fourier = AutoFourier(
    param_grid={"season_length": [12, 24], "n_terms": [3, 5]},
    metric="smape",
)
fourier.fit(y)
print("Prediction:", fourier.predict(h))

AutoKNN

from randomstatsmodels import AutoKNN

knn = AutoKNN(
    param_grid={"k": [3, 5, 7], "window": [12, 24]},
    metric="rmse",
)
knn.fit(y)
print("Prediction:", knn.predict(h))

AutoPolymath

from randomstatsmodels import AutoPolymath

poly = AutoPolymath(
    param_grid={"degree": [2, 3], "ridge": [0.0, 0.1]},
    metric="mae",
)
poly.fit(y)
print("Prediction:", poly.predict(h))

AutoThetaAR

from randomstatsmodels import AutoThetaAR

theta = AutoThetaAR(
    param_grid={"theta": [0.5, 1.0, 2.0]},
    metric="mape",
)
theta.fit(y)
print("Prediction:", theta.predict(h))

AutoHybridForecaster

from randomstatsmodels import AutoHybridForecaster

hybrid = AutoHybridForecaster(
    candidate_fourier=(0, 3, 6),
    candidate_trend=(0, 1),
    candidate_ar=(0, 3, 5),
    candidate_hidden=(8, 16, 32),
)
hybrid.fit(y)
print("Best config:", hybrid.best_config)
print("Prediction:", hybrid.predict(h))

AutoMELD

from randomstatsmodels import AutoMELD

meld = AutoMELD(
    lags_grid=(8, 12),
    scales_grid=((1, 3, 7), (1, 2, 4, 8)),
    rff_features_grid=(64, 128),
)
meld.fit(y)
print("Best config:", meld.best_["config"])
print("Prediction:", meld.predict(h))

AutoPALF

from randomstatsmodels import AutoPALF

palf = AutoPALF(
    p_candidates=(4, 8, 12),
    penalties=("huber", "l2"),
)
palf.fit(y)
print("Validation score:", palf.best_["val_score"])
print("Prediction:", palf.predict(h))

AutoNaive

Essential baseline forecasters for proper model evaluation.

from randomstatsmodels import AutoNaive

naive = AutoNaive(
    method_options=("last", "seasonal", "drift", "mean"),
    seasonal_periods=(1, 7, 12, 24),
)
naive.fit(y)
print("Best config:", naive.best_["config"])
print("Prediction:", naive.predict(h))

Methods:

  • "last": Repeat the last observed value
  • "seasonal": Repeat values from one seasonal period ago
  • "drift": Linear extrapolation from first to last value
  • "mean": Rolling or global mean

AutoHoltWinters

Classic Holt-Winters exponential smoothing with level, trend, and seasonal components.

from randomstatsmodels import AutoHoltWinters

hw = AutoHoltWinters(
    seasonal_periods=(12, 24),
    trend_options=("add", "none", "damped"),
    seasonal_options=("add", "none"),
)
hw.fit(y)
print("Best config:", hw.best_["config"])
print("Prediction:", hw.predict(h))

AutoSSA

Singular Spectrum Analysis: decomposes a time series using SVD to discover adaptive oscillatory modes.
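The underlying decomposition is standard SSA; a minimal sketch with the window length and rank fixed by hand rather than searched (`ssa_reconstruct` is an illustrative helper, not the library's implementation):

```python
import numpy as np

def ssa_reconstruct(y, window, rank):
    """Rank-limited SSA reconstruction via SVD of the trajectory matrix."""
    n = len(y)
    k = n - window + 1
    # Trajectory (Hankel) matrix: column j holds y[j : j + window]
    X = np.column_stack([y[j:j + window] for j in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_r = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Diagonal averaging maps the low-rank matrix back to a series
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        recon[j:j + window] += X_r[:, j]
        counts[j:j + window] += 1
    return recon / counts

t = np.arange(120)
y = np.sin(2 * np.pi * t / 24) + 0.05 * np.random.default_rng(0).normal(size=t.size)
smooth = ssa_reconstruct(y, window=36, rank=2)  # a pure sine needs rank 2
```

Forecasting then extrapolates the retained components; AutoSSA additionally searches over the window fraction and component count.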

from randomstatsmodels import AutoSSA

ssa = AutoSSA(
    window_fracs=(0.25, 0.33, 0.5),
    n_components_grid=(None, 2, 4, 8),
)
ssa.fit(y)
print("Best config:", ssa.best_["config"])
print("Prediction:", ssa.predict(h))

AutoLocalLinear

Weighted local regression with exponential decay for older observations.
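A rough sketch of the idea, assuming weights of the form `decay**age` applied to a polynomial fit (`local_linear_forecast` is illustrative, not the library's implementation):

```python
import numpy as np

def local_linear_forecast(y, h, decay=0.95, degree=1):
    """Weighted polynomial fit; decay**age downweights older observations."""
    n = len(y)
    t = np.arange(n)
    w = decay ** (n - 1 - t)  # newest point gets weight 1
    coefs = np.polyfit(t, y, deg=degree, w=w)
    return np.polyval(coefs, np.arange(n, n + h))

y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(local_linear_forecast(y, 3, decay=1.0))  # exact line -> [6. 7. 8.]
```

With `decay=1.0` this reduces to an ordinary least-squares trend; smaller decay values track recent behavior more closely.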

from randomstatsmodels import AutoLocalLinear

ll = AutoLocalLinear(
    decay_grid=(0.9, 0.95, 0.98, 1.0),
    degree_grid=(1, 2),
)
ll.fit(y)
print("Best config:", ll.best_["config"])
print("Prediction:", ll.predict(h))

AutoEnsemble

Combines multiple base forecasters with learned weights using validation performance.

from randomstatsmodels import AutoEnsemble

ensemble = AutoEnsemble(
    weighting_options=("uniform", "validation", "optimal"),
)
ensemble.fit(y)
print("Best config:", ensemble.best_["config"])
print("Base model scores:", ensemble.base_scores_)
print("Prediction:", ensemble.predict(h))

Weighting methods:

  • "uniform": Equal weights for all models
  • "validation": Weights inversely proportional to validation error
  • "optimal": Solve for weights that minimize validation error

AutoRIFT (Novel: Recursive Information Flow Tensor)

A cutting-edge forecasting model based on original "Predictive Information Field Dynamics" theory.

RIFT introduces a fundamentally new paradigm: instead of modeling values directly, it models how predictive information flows and transforms between different temporal channels (level, trend, curvature, oscillations) as the forecast horizon increases.

from randomstatsmodels import AutoRIFT

rift = AutoRIFT(
    n_frequencies_grid=(2, 4, 6),
    embedding_dim_grid=(2, 3),
    regularization_grid=(0.001, 0.01),
)
rift.fit(y)
print("Best config:", rift.best_["config"])
print("Prediction:", rift.predict(h))

# Analyze which channels hold predictive information
info = rift.get_information_analysis(horizon=5)
print("Information by channel:", info)

Theoretical Innovation:

  • Information Channels: Decomposes predictive power into orthogonal channels (level, trend, curvature, spectral components)
  • Information Flow Matrix: Learns how information transfers between channels as horizon increases
  • Fisher Information Estimation: Uses local variance reduction to estimate channel informativeness
  • Adaptive Reconstruction: Combines channel extrapolations weighted by propagated information state

Metrics

Available out of the box:

from randomstatsmodels.metrics import mae, mse, rmse, mape, smape
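These follow the usual textbook definitions; a rough sketch of what each computes (the library's exact conventions, e.g. percentage scaling and zero handling, may differ):

```python
import numpy as np

def mae(a, f):   return float(np.mean(np.abs(a - f)))
def mse(a, f):   return float(np.mean((a - f) ** 2))
def rmse(a, f):  return float(np.sqrt(mse(a, f)))
def mape(a, f):  return float(np.mean(np.abs((a - f) / a)) * 100)
def smape(a, f): return float(np.mean(2 * np.abs(f - a) / (np.abs(a) + np.abs(f))) * 100)

a = np.array([100.0, 200.0])  # actuals
f = np.array([110.0, 180.0])  # forecasts
print(mae(a, f))   # 15.0
print(mape(a, f))  # 10.0
```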

License

MIT © 2025 Jacob Wright

Download files


Source Distribution

randomstatsmodels-1.3.0.tar.gz (61.8 kB)

Built Distribution

randomstatsmodels-1.3.0-py3-none-any.whl (76.1 kB)

File details

Details for the file randomstatsmodels-1.3.0.tar.gz.

File metadata

  • Download URL: randomstatsmodels-1.3.0.tar.gz
  • Size: 61.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.1

File hashes

  • SHA256: fcb72f2eca025abd0b17a5b07d661e107fe6d01db2738a249d019c87401e8f2b
  • MD5: f3843c8eaa1bacf109460c696f6cebcb
  • BLAKE2b-256: cd35af04bef1f663fcfce75c460258b887483186c8859058c18ff6a722317a2b

File details

Details for the file randomstatsmodels-1.3.0-py3-none-any.whl.

File metadata

File hashes

  • SHA256: bd5691afdbbd09de794805833b89246df76d459da93ae64f0566f7f0f03e1ac7
  • MD5: 4326ac14478f4a18368f5684a4d265b5
  • BLAKE2b-256: 3be546366f17945983854729f7ffdfde75eeb90e5ec9aa8cd14721a07a1d847d
