
A transformer-based architecture for robust panel time series forecasting, extending the Temporal Fusion Transformer with multi-scale decomposition, Segment-wise Attention, cross-entity attention, and adaptive trend-seasonal modeling.


Panelformer


Panelformer is a transformer-based deep learning model designed for accurate and scalable panel time series forecasting. Built on top of the Temporal Fusion Transformer (TFT), Panelformer introduces several innovations to address the limitations of existing models when applied to heterogeneous panel datasets.


🔑 Keywords

Panel time series, Temporal Fusion Transformer, Transformer architecture, Forecasting, Panelformer


🔍 Overview

Panel time series involve multiple entities (e.g., countries, products) observed over time — requiring models to handle both temporal and cross-sectional complexity. Panelformer enhances forecasting accuracy across diverse panel structures by integrating the following features:

  • Segment-wise Attention: Reduces complexity and captures local patterns in long sequences.
  • Multi-Scale Series Decomposition: Separates trend and seasonal components using learnable moving averages.
  • Cross-Entity Attention: Models dependencies between different panel entities.
  • Parallel Trend-Seasonal Paths: Dedicated processing for distinct temporal dynamics.
  • Adaptive Component Weighting: Learns to prioritize seasonal vs. trend features dynamically.
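The trend-seasonal split at the heart of the decomposition can be illustrated with a plain moving-average sketch. This is a generic example of the technique, not Panelformer's internal implementation, which uses learnable multi-scale kernels:

```python
import numpy as np

def decompose(series: np.ndarray, kernel_size: int):
    """Split a series into a trend (moving average) and a seasonal (residual) part."""
    pad = kernel_size // 2
    padded = np.pad(series, pad, mode="edge")  # replicate edges so output length matches input
    kernel = np.ones(kernel_size) / kernel_size
    trend = np.convolve(padded, kernel, mode="valid")
    seasonal = series - trend  # residual after removing the smoothed trend
    return trend, seasonal

t = np.arange(48)
series = 0.1 * t + np.sin(2 * np.pi * t / 12)  # linear trend + yearly cycle
trend, seasonal = decompose(series, kernel_size=13)
```

By construction `trend + seasonal` reconstructs the original series exactly; multi-scale decomposition repeats this split at several kernel sizes so the model sees both slow and fast dynamics.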

🚀 Key Results

Evaluated on 11 real-world datasets spanning economics, energy, climate, and health domains, Panelformer achieved:

  • 7.99% average MAPE improvement over baseline TFT
  • Robust performance on balanced/unbalanced, short/long, micro/macro panels
  • Significant accuracy gains on high-frequency datasets like foreign exchange, surface temperature, and electricity

📦 Installation

pip install panelformer

🛠️ How to Use

Using Panelformer involves four main steps:

1. Data Preparation

Prepare your panel dataset with columns for entity IDs, timestamps, and target values. Convert your date column to datetime format and create a sequential time index for modeling.
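A minimal sketch of this step, using a hypothetical two-entity panel (column names are illustrative):

```python
import pandas as pd

# Hypothetical panel: one row per (entity, date) observation.
df = pd.DataFrame({
    "entity_id": ["US", "US", "US", "UK", "UK", "UK"],
    "date": ["2024-01-01", "2024-02-01", "2024-03-01"] * 2,
    "target": [1.0, 1.2, 1.1, 0.9, 0.95, 1.0],
})

# Convert the date column to datetime and build a sequential time index,
# counted here in months from the earliest date in the panel.
df["date"] = pd.to_datetime(df["date"])
start = df["date"].min()
df["time_idx"] = (df["date"].dt.year - start.year) * 12 + (df["date"].dt.month - start.month)
```

The integer `time_idx` restarts per frequency, not per entity, so entities observed over the same period share the same index values.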

2. Dataset Creation

Use the pytorch_forecasting library’s TimeSeriesDataSet class to define training, validation, and test datasets. This includes specifying encoder and decoder lengths, grouping by entity, and applying normalization.


3. Train the Model

Instantiate the Panelformer model from the prepared dataset, configure hyperparameters (learning rate, hidden sizes, attention heads), and train it with a PyTorch Lightning Trainer.

import lightning.pytorch as pl
from lightning.pytorch.callbacks import EarlyStopping, LearningRateMonitor
from lightning.pytorch.loggers import TensorBoardLogger
from pytorch_forecasting.metrics import QuantileLoss
from panelformer import Panelformer  # import path assumed from the package name

early_stop_callback = EarlyStopping(monitor="val_loss", min_delta=1e-10, patience=5, verbose=False, mode="min")
lr_logger = LearningRateMonitor()
logger = TensorBoardLogger("lightning_logs")

trainer = pl.Trainer(
    max_epochs=30,  # set to an appropriate number of epochs for your dataset
    enable_model_summary=True,
    gradient_clip_val=0.1,
    callbacks=[lr_logger, early_stop_callback],
    logger=logger,
)

model = Panelformer.from_dataset(
    training,
    learning_rate=0.001,
    hidden_size=16,
    attention_head_size=2,
    dropout=0.1,
    segment_size=4,
    decomposition_kernel_sizes=[3, 7, 15, 31],
    trend_processing_layers=2,
    use_cross_series_attention=True,
    adaptive_trend_weight=True,
    hidden_continuous_size=8,
    loss=QuantileLoss()
)

trainer.fit(model, train_dataloaders=train_dataloader, val_dataloaders=val_dataloader)

4. Make Predictions

Pass a test dataloader to the trained model to obtain forecasts for evaluation or downstream use; mode="raw" returns the full network output, and return_x=True also returns the model inputs alongside it.

predictions = model.predict(test_dataloader, mode="raw", return_x=True)


🛠 License

Released under the MIT License. Built on top of PyTorch Forecasting.

