Forecaster AI 📈
An enterprise-grade time series forecasting package with automatic decomposition, MLOps capabilities, and support for multiple forecasting models including ARIMA, Prophet, and LSTM.
✨ Key Features
🎯 Advanced Forecasting
- Automatic Time Series Decomposition (STL, Classical) - NEW! ⭐
- Multiple Models: ARIMA, Prophet, LSTM, and Ensemble methods
- Intermittent Demand Forecasting: Croston's, SBA, TSB methods
- New Product Forecasting: Cold start with bootstrapping
- Adaptive Pattern Detection: Automatically chooses best approach
🔄 Data Processing
- Smart Decomposition: Separates trend, seasonality, and residuals
- Outlier Detection: IQR, Z-score, MAD methods
- Missing Value Handling: Multiple imputation strategies
- Data Validation: Comprehensive quality checks
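For reference, the three outlier rules named above (IQR, Z-score, MAD) are standard statistics and easy to state in plain NumPy. The functions below are an illustrative sketch of those rules, not the package's own API:

```python
import numpy as np

def iqr_outliers(x, k=1.5):
    """Flag points outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def zscore_outliers(x, threshold=3.0):
    """Flag points more than `threshold` standard deviations from the mean."""
    z = (x - x.mean()) / x.std()
    return np.abs(z) > threshold

def mad_outliers(x, threshold=3.5):
    """Flag points by modified Z-score based on the median absolute deviation."""
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    modified_z = 0.6745 * (x - med) / mad
    return np.abs(modified_z) > threshold

x = np.array([10.0, 11.0, 10.5, 9.8, 10.2, 50.0])  # 50.0 is an obvious outlier
print(iqr_outliers(x))  # the last point is flagged
```

The Z-score rule is the least robust of the three: a single large outlier inflates both the mean and the standard deviation, which is why the MAD-based modified Z-score is usually preferred on small samples.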
🚀 MLOps Integration
- Experiment Tracking: MLflow integration
- Model Registry: Version control and lifecycle management
- Performance Monitoring: Real-time drift detection
- REST API: FastAPI-based endpoints
🌐 Production Ready
- Docker Support: Containerized deployment
- CI/CD Pipeline: Automated testing and deployment
- Monitoring: Prometheus and Grafana integration
- Scalability: Horizontal scaling support
📦 Installation
```bash
pip install forecaster-ai
```
From source
```bash
git clone https://github.com/surya08084/forecaster-ai.git
cd forecaster-ai
pip install -e .
```
🚀 Quick Start
Basic Forecasting with Decomposition
```python
import numpy as np
import pandas as pd
from forecasting.core.config import ForecastConfig, PreprocessingConfig
from forecasting.models.prophet import ProphetForecaster

# Prepare data
dates = pd.date_range('2023-01-01', periods=365, freq='D')
values = [100 + i * 0.5 + 20 * np.sin(2 * np.pi * i / 7) for i in range(365)]
data = pd.Series(values, index=dates, name='sales')

# Configure with automatic decomposition
config = ForecastConfig(
    model_type='prophet',
    horizon=30,
    frequency='D',
    preprocessing=PreprocessingConfig(
        enable_decomposition=True,       # Enable decomposition
        decomposition_method='stl',      # 'stl' or 'classical'
        decomposition_model='additive',  # 'additive' or 'multiplicative'
        seasonal_period=7,               # Weekly seasonality
        handle_outliers=True
    )
)

# Create and train model
model = ProphetForecaster(config)
model.fit(data)

# Make predictions (automatically reconstructed to original scale)
predictions, conf_intervals = model.predict(horizon=30)

# Access decomposition components
components = model.get_decomposition_components()
print("Trend:", components['trend'])
print("Seasonal:", components['seasonal'])
print("Residual:", components['residual'])
```
Standalone Time Series Decomposition
```python
from forecasting.data.preprocessors import TimeSeriesDecomposer

# Create decomposer
decomposer = TimeSeriesDecomposer(
    method='stl',       # Robust to outliers
    model='additive',   # or 'multiplicative'
    period=7            # Weekly seasonality
)

# Decompose time series
trend, seasonal, residual = decomposer.fit_transform(data)

# Reconstruct original
reconstructed = decomposer.reconstruct()
```
Intermittent Demand (Sparse Data)
```python
from forecasting.data.special_cases import IntermittentDemandHandler

# For data with many zeros (retail, spare parts)
handler = IntermittentDemandHandler(
    method='sba',   # Syntetos-Boylan Approximation
    alpha=0.1
)
handler.fit(sparse_data)
forecast = handler.predict(horizon=12)
```
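For background, Croston's method smooths non-zero demand sizes and inter-demand intervals separately with simple exponential smoothing and forecasts their ratio; SBA multiplies by (1 - alpha/2) to correct Croston's known bias. The following is a from-scratch sketch of the technique, not the package's internal implementation:

```python
import numpy as np

def croston(demand, alpha=0.1, variant='croston'):
    """Croston's intermittent-demand forecast.

    Smooths non-zero demand sizes and inter-demand intervals separately;
    'sba' applies the Syntetos-Boylan bias correction (1 - alpha/2).
    """
    demand = np.asarray(demand, dtype=float)
    nonzero = np.flatnonzero(demand)
    if len(nonzero) == 0:
        return 0.0
    z = demand[nonzero[0]]   # demand-size estimate
    p = nonzero[0] + 1       # interval estimate, seeded with the first demand's position
    q = 1                    # periods since the last demand
    for d in demand[nonzero[0] + 1:]:
        if d > 0:
            z = z + alpha * (d - z)
            p = p + alpha * (q - p)
            q = 1
        else:
            q += 1
    forecast = z / p
    if variant == 'sba':
        forecast *= 1 - alpha / 2
    return forecast

sparse = [0, 3, 0, 0, 2, 0, 4, 0, 0, 0, 3, 0]
print(croston(sparse, alpha=0.1))
print(croston(sparse, alpha=0.1, variant='sba'))
```

Note the per-period forecast is flat: intermittent-demand methods estimate an average demand rate, not a point forecast for each future period.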
New Product Forecasting (Cold Start)
```python
from forecasting.data.special_cases import NewProductHandler

# Bootstrap from similar products
handler = NewProductHandler()
bootstrap_data = handler.bootstrap_from_similar(
    similar_products={
        'product_A': historical_data_A,
        'product_B': historical_data_B
    },
    similarity_scores={
        'product_A': 0.85,
        'product_B': 0.72
    }
)

# Use bootstrap data for forecasting
config = ForecastConfig(model_type='prophet', horizon=30)
model = ProphetForecaster(config)
model.fit(bootstrap_data)
predictions, _ = model.predict()
```
📖 Model Usage
ARIMA with Decomposition
```python
from forecasting.core.config import ForecastConfig, PreprocessingConfig
from forecasting.models.arima import ARIMAForecaster

config = ForecastConfig(
    model_type='arima',
    horizon=30,
    preprocessing=PreprocessingConfig(
        enable_decomposition=True,
        decomposition_method='stl'
    ),
    model_params={
        'order': (1, 1, 1),             # (p, d, q)
        'seasonal_order': (1, 1, 1, 7)  # (P, D, Q, s)
    }
)

model = ARIMAForecaster(config)
model.fit(data)
predictions, conf_intervals = model.predict()
```
Prophet with Custom Seasonality
```python
config = ForecastConfig(
    model_type='prophet',
    horizon=30,
    preprocessing=PreprocessingConfig(
        enable_decomposition=True
    ),
    model_params={
        'seasonality_mode': 'multiplicative',
        'yearly_seasonality': True,
        'weekly_seasonality': True,
        'daily_seasonality': False
    }
)

model = ProphetForecaster(config)
model.fit(data)
predictions, conf_intervals = model.predict()
```
LSTM Deep Learning
```python
# Import path assumed to mirror the ARIMA/Prophet modules
from forecasting.models.lstm import LSTMForecaster

config = ForecastConfig(
    model_type='lstm',
    horizon=30,
    preprocessing=PreprocessingConfig(
        enable_decomposition=True,
        normalize=True
    ),
    model_params={
        'hidden_size': 64,
        'num_layers': 2,
        'dropout': 0.2,
        'learning_rate': 0.001,
        'epochs': 100
    }
)

model = LSTMForecaster(config)
model.fit(data)
predictions, conf_intervals = model.predict()
```
🔧 Advanced Features
Data Validation
```python
from forecasting.data.validators import TimeSeriesValidator

validator = TimeSeriesValidator()
is_valid, errors = validator.validate(data)

# Check stationarity
is_stationary, p_value = validator.check_stationarity(data)

# Detect seasonality
has_seasonality, period = validator.detect_seasonality(data)

# Detect outliers
outliers = validator.detect_outliers(data, method='iqr')
```
Evaluation Metrics
```python
from forecasting.evaluation.metrics import ForecastMetrics

metrics = ForecastMetrics()

# Calculate metrics
mae = metrics.mae(actual, predicted)
rmse = metrics.rmse(actual, predicted)
mape = metrics.mape(actual, predicted)
smape = metrics.smape(actual, predicted)
mase = metrics.mase(actual, predicted, seasonal_period=7)
```
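These metrics follow their standard definitions; the plain-NumPy sketch below is independent of the ForecastMetrics class. One assumption: the sketch's `mase` takes the training series explicitly to compute the seasonal-naive scaling term, whereas the package call above supplies only `seasonal_period`:

```python
import numpy as np

def mae(actual, predicted):
    return np.mean(np.abs(actual - predicted))

def rmse(actual, predicted):
    return np.sqrt(np.mean((actual - predicted) ** 2))

def mape(actual, predicted):
    """Mean absolute percentage error; undefined when actual contains zeros."""
    return 100 * np.mean(np.abs((actual - predicted) / actual))

def smape(actual, predicted):
    """Symmetric MAPE, bounded in [0, 200]."""
    denom = (np.abs(actual) + np.abs(predicted)) / 2
    return 100 * np.mean(np.abs(actual - predicted) / denom)

def mase(actual, predicted, train, seasonal_period=1):
    """MAE scaled by the in-sample seasonal-naive MAE of the training series."""
    naive_mae = np.mean(np.abs(train[seasonal_period:] - train[:-seasonal_period]))
    return np.mean(np.abs(actual - predicted)) / naive_mae

actual = np.array([100.0, 110.0, 105.0])
predicted = np.array([98.0, 112.0, 104.0])
print(mae(actual, predicted))   # (2 + 2 + 1) / 3
```

MAPE's division by `actual` is why intermittent (zero-heavy) series should be scored with sMAPE or MASE instead.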
Backtesting
```python
from forecasting.evaluation.backtesting import RollingOriginBacktester

backtester = RollingOriginBacktester(
    initial_window=100,
    horizon=10,
    step=1
)
results = backtester.run(model, data)
print(f"Average RMSE: {results['avg_rmse']}")
```
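Rolling-origin backtesting refits on a growing window and scores the next `horizon` points at each step. A minimal sketch of the procedure, with a hypothetical `fit_predict` callable standing in for a model:

```python
import numpy as np

def rolling_origin_backtest(fit_predict, series, initial_window=100, horizon=10, step=1):
    """Expanding-window backtest.

    `fit_predict(train, horizon)` is any callable returning `horizon`
    forecasts from the training slice; returns the RMSE of each fold.
    """
    rmses = []
    for origin in range(initial_window, len(series) - horizon + 1, step):
        train = series[:origin]
        actual = series[origin:origin + horizon]
        forecast = fit_predict(train, horizon)
        rmses.append(np.sqrt(np.mean((actual - forecast) ** 2)))
    return np.array(rmses)

# Naive forecaster: repeat the last observed value.
naive = lambda train, h: np.full(h, train[-1])
series = np.arange(20, dtype=float)   # perfectly linear series
scores = rolling_origin_backtest(naive, series, initial_window=10, horizon=3, step=2)
print(scores.mean())
```

Because each fold only ever trains on data before the forecast origin, the resulting scores are free of look-ahead leakage, unlike a random train/test split.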
MLflow Tracking
```python
from forecasting.mlops.tracking import ExperimentTracker

tracker = ExperimentTracker(
    experiment_name='sales_forecasting',
    tracking_uri='http://localhost:5000'
)

with tracker.start_run():
    model.fit(data)
    tracker.log_params(config.get_model_params())
    tracker.log_metrics({'rmse': rmse, 'mae': mae})
    tracker.log_model(model, 'prophet_model')
```
🌐 REST API
Start API Server
```bash
# Using uvicorn
uvicorn forecasting.api.main:app --host 0.0.0.0 --port 8000

# Using Docker
docker-compose up
```
API Endpoints
```python
import requests

# Health check
response = requests.get('http://localhost:8000/health')

# Make prediction
response = requests.post(
    'http://localhost:8000/predict',
    json={
        'data': data.tolist(),
        'horizon': 30,
        'model_type': 'prophet',
        'enable_decomposition': True
    }
)
predictions = response.json()['predictions']
```
📊 Why Decomposition?
Benefits
- Better Accuracy: Models work on clean residuals
- Faster Training: Simpler patterns to learn
- Interpretability: Understand trend vs seasonality
- Robustness: Handles outliers better
When to Use
- ✅ Regular patterns: Daily, weekly, monthly seasonality
- ✅ Trending data: Long-term growth or decline
- ✅ Clean forecasts: Separate noise from signal
When to Skip
- ❌ Intermittent demand: Use Croston's methods instead
- ❌ New products: Use bootstrapping instead
- ❌ Very short series: Not enough data to decompose
The package automatically detects which approach to use!
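The accuracy argument is easy to check on synthetic data. Below is a self-contained classical additive decomposition in plain NumPy (an illustrative sketch, not the package's TimeSeriesDecomposer): for a noiseless trend-plus-weekly-cycle series, almost nothing is left in the residual once trend and seasonality are removed.

```python
import numpy as np

def classical_decompose(y, period):
    """Additive classical decomposition: centered moving-average trend,
    period-averaged seasonal component, residual = remainder.
    Assumes an odd period for a symmetric window."""
    y = np.asarray(y, dtype=float)
    k = period // 2
    trend = np.full_like(y, np.nan)
    for i in range(k, len(y) - k):
        trend[i] = y[i - k:i + k + 1].mean()   # centered moving average
    detrended = y - trend
    # Average the detrended values at each phase of the cycle.
    seasonal = np.array([np.nanmean(detrended[i::period]) for i in range(period)])
    seasonal -= seasonal.mean()                # force the seasonal component to sum to ~0
    seasonal_full = np.tile(seasonal, len(y) // period + 1)[:len(y)]
    residual = y - trend - seasonal_full
    return trend, seasonal_full, residual

t = np.arange(140)
y = 100 + 0.5 * t + 20 * np.sin(2 * np.pi * t / 7)   # trend + weekly cycle, no noise
trend, seasonal, resid = classical_decompose(y, period=7)
print(np.nanmax(np.abs(resid)))   # near zero: nothing left after trend + seasonality
```

A forecaster then only has to model the smooth trend and a seven-value seasonal profile instead of the raw oscillating series, which is where the accuracy and training-speed benefits come from.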
📚 Documentation
- Quick Start Guide - Complete usage examples
- Migration Guide - Updating from old API
- PyPI Publishing - Publishing guide
- Examples - Jupyter notebooks and scripts
🔄 What's New in v0.2.2
⭐ Major Features
- Automatic Time Series Decomposition (STL & Classical methods)
- Adaptive Pattern Detection (Regular, Intermittent, New Product)
- Intermittent Demand Forecasting (Croston's, SBA, TSB)
- New Product Forecasting (Bootstrapping from similar products)
- Enhanced Preprocessing Pipeline (Outliers, normalization, decomposition)
🔧 Improvements
- Configuration-based API for consistency
- Automatic reconstruction of predictions
- Component access for analysis
- Better error handling
🤝 Contributing
Contributions are welcome! Please read our Contributing Guide.
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
👤 Author
Surya Tripathi
- Email: suryaec1099@gmail.com
- GitHub: @surya08084
🙏 Acknowledgments
- STL decomposition based on Cleveland et al. (1990)
- Croston's method for intermittent demand
- Prophet by Facebook Research
- MLflow for experiment tracking
📈 Citation
If you use this package in your research, please cite:
```bibtex
@software{forecaster_ai,
  author = {Tripathi, Surya},
  title  = {Forecaster AI: Enterprise Time Series Forecasting with Automatic Decomposition},
  year   = {2024},
  url    = {https://github.com/surya08084/forecaster-ai}
}
```
Made with ❤️ by Bob and Surya
File details
Details for the file forecaster_ai-0.2.4.tar.gz.
File metadata
- Download URL: forecaster_ai-0.2.4.tar.gz
- Upload date:
- Size: 95.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `a03db003292a46af1ce55d4939b63a33a201c1386af4207de14f01d986fee31f` |
| MD5 | `00b62e03f5283457556f589ac317f23a` |
| BLAKE2b-256 | `b5dca7fec22e54582bfb33390a94e117cb927f1d75756f7101bd7163976034ea` |
File details
Details for the file forecaster_ai-0.2.4-py3-none-any.whl.
File metadata
- Download URL: forecaster_ai-0.2.4-py3-none-any.whl
- Upload date:
- Size: 119.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `a787dd31fc3c2fcfddee6de74b4e5145e2dfff9d4602e46ceb13f2d105447ca4` |
| MD5 | `be253bc60ce4cda2d9ff149cba0a811c` |
| BLAKE2b-256 | `8c775fcc4d324286065ac7ffa5d45f8166e0d956d2dfed5fc2056bb5db1af239` |