PIBERT: A Physics-Informed Transformer with Hybrid Spectral Embeddings for Multiscale PDE Modeling
Introduction
PIBERT (Physics-Informed BERT-style Transformer) is a novel framework for solving multiscale partial differential equations (PDEs) that integrates hybrid spectral embeddings (combining Fourier and Wavelet approaches), physics-biased attention mechanisms, and self-supervised pretraining.
Unlike existing approaches that only partially address the multiscale challenge, PIBERT unifies three major innovations:
- A hybrid Fourier-Wavelet embedding that captures both global structures and localized phenomena
- A physics-informed attention bias derived from PDE residuals
- A dual-task self-supervised pretraining strategy (Masked Physics Prediction & Equation Consistency Prediction)
These innovations enable PIBERT to generalize beyond specific PDEs, outperform baselines on sparse or complex datasets, and capture dynamic multiscale structure in a stable and interpretable latent space.
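To give a feel for the first innovation, here is a minimal, self-contained sketch of a hybrid Fourier-Wavelet feature extractor in 1D using plain NumPy. The function name and the fixed Haar wavelet are illustrative assumptions; PIBERT's actual embedding is learned and multidimensional.

```python
import numpy as np

def hybrid_spectral_embedding(u, n_fourier=8, n_wavelet=8):
    """Toy 1D hybrid Fourier-Wavelet embedding (illustrative only).

    Low Fourier modes summarize global structure; Haar wavelet detail
    coefficients summarize localized features. The two views are
    concatenated into a single feature vector.
    """
    u = np.asarray(u, dtype=float)

    # Global view: magnitudes of the lowest Fourier modes.
    fourier_feats = np.abs(np.fft.rfft(u))[:n_fourier]

    # Local view: one level of Haar wavelet detail coefficients.
    pairs = u[: len(u) // 2 * 2].reshape(-1, 2)
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
    wavelet_feats = detail[:n_wavelet]

    return np.concatenate([fourier_feats, wavelet_feats])

# A field with a smooth global wave plus one sharp local spike:
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
u = np.sin(x)
u[10] += 5.0  # localized feature at grid point 10
emb = hybrid_spectral_embedding(u)
print(emb.shape)  # (16,)
```

The Fourier half of the vector barely changes when the spike moves, while the wavelet half pinpoints it: the dominant detail coefficient sits at the pair containing grid point 10. This is exactly the complementarity the hybrid embedding exploits.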
Key Features
- Hybrid Spectral Embeddings: Combines Fourier and Wavelet transforms to capture both global patterns and localized features
- Physics-Biased Attention: Incorporates PDE residuals directly into attention calculation for physically consistent predictions
- Self-Supervised Pretraining: Includes Masked Physics Prediction (MPP) and Equation Consistency Prediction (ECP) tasks
- Multiscale Modeling: Designed specifically for PDEs with rich multiscale behavior
- Hardware-Aware Implementation: Works across different hardware configurations
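The physics-biased attention idea can be sketched in a few lines: a per-token PDE residual is turned into an additive bias on the attention logits before the softmax, so tokens that violate the governing equation attract less attention. The function below is a single-head NumPy illustration under that assumption; how PIBERT actually derives and scales the bias is part of the learned model.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def physics_biased_attention(Q, K, V, residual):
    """Toy single-head attention with a physics bias (illustrative only).

    `residual` holds a per-token PDE residual magnitude; keys with a
    large residual are down-weighted by subtracting the bias from the
    attention logits before the softmax.
    """
    d = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d)        # standard scaled dot-product
    logits = logits - residual[None, :]  # penalize high-residual keys
    weights = softmax(logits, axis=-1)
    return weights @ V, weights

rng = np.random.default_rng(0)
n, d = 6, 4
Q, K, V = rng.normal(size=(3, n, d))
residual = np.array([0.0, 0.0, 5.0, 0.0, 0.0, 0.0])  # token 2 violates the PDE
out, w = physics_biased_attention(Q, K, V, residual)
print(w.sum(axis=-1))  # each row still sums to 1
```

Because the bias enters before the softmax, each attention row remains a valid probability distribution; the high-residual token simply receives far less than the uniform 1/6 share of attention.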
Hardware Requirements
PIBERT is designed to be accessible across different hardware configurations:
| Task | Minimum (RTX 3060) | Recommended (A100) | Notes |
|---|---|---|---|
| Model Inference (2D) | ✓ | ✓ | 64×64 grids work on both |
| Model Training (2D) | ✓ (small batches) | ✓ | RTX 3060 requires gradient checkpointing |
| 3D Problem Inference | ✗ | ✓ | Requires 40+ GB VRAM |
| Pretraining | ✗ | ✓ | Not feasible on consumer GPUs |
Installation
```bash
# Basic installation
pip install pibert

# For development with testing and documentation tools
pip install "pibert[dev]"

# For full functionality including wavelet transforms
pip install "pibert[full]"
```
Quick Start
Verify the installation on CPU (runs in under 60 s on any system):

```python
from pibert import PIBERT
from pibert.utils import load_dataset

# Load a small sample dataset
dataset = load_dataset("reaction_diffusion")

# Initialize a small model
model = PIBERT(
    input_dim=1,
    hidden_dim=64,
    num_layers=2,
    num_heads=4,
)

# Perform prediction
pred = model.predict(dataset["test"]["x"][:1], dataset["test"]["coords"][:1])
print(f"Prediction shape: {pred.shape}")
```
For more examples, see the examples directory.
Performance Comparison
PIBERT demonstrates state-of-the-art performance across multiple benchmarks:
1D Reaction Equation
| Model | Relative L1 | Relative L2 | MAE |
|---|---|---|---|
| PINN | 0.0651 | 0.0803 | 0.0581 |
| FNO | 0.0123 | 0.0150 | 0.0100 |
| Transformer | 0.0225 | 0.0243 | 0.0200 |
| PINNsFormer | 0.0065 | 0.0078 | 0.0060 |
| PIBERT | 0.0061 | 0.0074 | 0.0056 |
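For readers unfamiliar with the error columns, here is a NumPy sketch of the usual definitions of relative L1, relative L2, and MAE. These are the standard formulas; the paper's exact normalization may differ.

```python
import numpy as np

def relative_l1(pred, true):
    # ||pred - true||_1 / ||true||_1
    return np.abs(pred - true).sum() / np.abs(true).sum()

def relative_l2(pred, true):
    # ||pred - true||_2 / ||true||_2
    return np.linalg.norm(pred - true) / np.linalg.norm(true)

def mae(pred, true):
    # Mean absolute error
    return np.abs(pred - true).mean()

true = np.array([1.0, 2.0, 3.0, 4.0])
pred = np.array([1.1, 1.9, 3.0, 4.2])
print(relative_l1(pred, true))  # 0.04
print(mae(pred, true))          # 0.1
```

The relative metrics normalize by the magnitude of the ground-truth solution, which makes scores comparable across PDEs with very different solution scales.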
CFDBench (Cavity Flow)
| Model | MSE(u) | MSE(v) | MSE(p) |
|---|---|---|---|
| PINNs | 0.0500 | 0.0300 | 0.01500 |
| Spectral PINN | 0.0200 | 0.0045 | 0.00085 |
| FNO | 0.0113 | 0.0012 | 0.00021 |
| PINNsFormer | 0.0065 | 0.0007 | 0.00003 |
| PIBERT (Lite) | 0.0103 | 0.0011 | 0.000046 |
Ablation Study Results
The ablation study confirms the importance of each component:
| Model Variant | MSE (Test) | NMSE (Test) |
|---|---|---|
| PIBERT (Full) | 0.4975 | 1.3409 |
| Fourier-only | 1.6520 | 12.4010 |
| Wavelet-only | 0.4123 | 1.1021 |
| Standard-attention | 1.3201 | 9.8760 |
| FNO | 1.8099 | 13.5830 |
| UNet | 3.7006 | 29.2627 |
Disabling the physics-biased attention mechanism leads to a significant performance drop: test MSE increases from 0.4975 to 1.3201, and NMSE jumps from 1.34 to 9.88.
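The ablation table reports both MSE and NMSE. One common definition of NMSE, sketched below in NumPy, normalizes the MSE by the mean squared magnitude of the target; the paper's exact normalization may differ.

```python
import numpy as np

def mse(pred, true):
    # Mean squared error
    return ((pred - true) ** 2).mean()

def nmse(pred, true):
    # MSE normalized by the mean squared magnitude of the target,
    # so scores are comparable across fields of different scale.
    return mse(pred, true) / (true ** 2).mean()

true = np.array([2.0, -2.0, 2.0, -2.0])
pred = np.array([1.0, -1.0, 1.0, -1.0])
print(mse(pred, true))   # 1.0
print(nmse(pred, true))  # 0.25
```

Under this convention an NMSE near or above 1 means the model does no better than predicting zero everywhere, which puts the Fourier-only and standard-attention variants' scores in context.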
Reproducibility
All results in the paper can be reproduced with the provided code. The ablation studies were verified on an RTX 3060 (12 GB VRAM), while the full-scale experiments used A100 GPUs. We provide configuration files for both hardware setups.
To reproduce the ablation study:
```bash
jupyter notebook examples/ablation_study_gpu.ipynb
```
Citing PIBERT
If you find PIBERT useful in your research, please cite our paper:
```bibtex
@article{chakraborty2025pibert,
  title={PIBERT: A Physics-Informed Transformer with Hybrid Spectral Embeddings for Multiscale PDE Modeling},
  author={Chakraborty, Somyajit and Chen, Xizhong},
  year={2025}
}
```
License
Distributed under the Apache 2.0 License. See LICENSE for more information.
Support
For support and questions, please open an issue on GitHub or contact the authors:
- Somyajit Chakraborty: somyajit.chakraborty@example.com
- Chen Xizhong: chen.xizhong@example.com
PIBERT is developed at the Department of Chemistry and Chemical Engineering, Shanghai Jiao Tong University.
Download files
File details
Details for the file pibert-0.1.1.tar.gz.
File metadata
- Download URL: pibert-0.1.1.tar.gz
- Size: 17.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0845c8bf9c69d9cc566fb083a3acab399985b52d96ade5ebd9f7e9f0dea61c63 |
| MD5 | d4354ef9341843efd5b25c3b54d893ab |
| BLAKE2b-256 | 10875f9c16ffa0a097bb61aa5904e6f15950fe565b0de5ce60fa82c0f8c97572 |
File details
Details for the file pibert-0.1.1-py3-none-any.whl.
File metadata
- Download URL: pibert-0.1.1-py3-none-any.whl
- Size: 16.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1705e49c2c6304465ebd7956fed3a2f6b96d101cabbacdbf48d728cc320e248e |
| MD5 | f5dea36ac5baf58d861d5cf85b923370 |
| BLAKE2b-256 | e03d5a2d3bc1837cdb2fe159668a2fc6892f3873987522d55c898a0bf23060c6 |