Library with implementations of various ensembling methods.
Bensemble: Modular Bayesian Deep Learning & Ensembling
Bensemble is a production-ready, lightweight library for Bayesian Deep Learning and Neural Network Ensembling.
Key Resources
| Resource | Description |
|---|---|
| Documentation | Full API reference and user guides. |
| Tech Report | In-depth technical details and theoretical background. |
| Blog Post | Summary of the project and motivation. |
| Benchmarks | Comparison of methods on standard datasets. |
Features
- PyTorch-Native: No hidden training loops. Use standard PyTorch to train your models, and use Bensemble for inference, ensembling, and analytics.
- Unified Ensembling API: Seamlessly combine explicit models (Deep Ensembles, NAS) and implicit methods (MC Dropout) via a single `Ensemble` interface.
- Neural Ensemble Search (NES): Algorithms to automatically search for diverse architectures using NNI and Stein Variational Gradient Descent (SVGD).
- Uncertainty Analytics: Principled decomposition of predictive uncertainty into aleatoric (data noise) and epistemic (model ignorance) components.
- Model Calibration & Metrics: Evaluate models using Expected Calibration Error (ECE), Brier Score, and NLL. Fix overconfident networks post-hoc with Temperature and Vector Scaling.
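The uncertainty decomposition above follows the standard entropy-based scheme: the entropy of the averaged prediction (total) splits into the average member entropy (aleatoric) plus the mutual information between prediction and model (epistemic). A minimal sketch in plain PyTorch, independent of Bensemble's own `decompose_classification_uncertainty`:

```python
import torch

def decompose_uncertainty(probs: torch.Tensor, eps: float = 1e-12):
    """probs: [n_members, batch, n_classes] member probabilities."""
    mean_probs = probs.mean(dim=0)                                   # [batch, n_classes]
    total = -(mean_probs * (mean_probs + eps).log()).sum(dim=-1)     # entropy of the mean
    aleatoric = -(probs * (probs + eps).log()).sum(dim=-1).mean(0)   # mean member entropy
    epistemic = total - aleatoric                                    # mutual information
    return total, aleatoric, epistemic

probs = torch.softmax(torch.randn(5, 10, 3), dim=-1)
total, aleatoric, epistemic = decompose_uncertainty(probs)
```

Because entropy is concave, epistemic uncertainty is non-negative and vanishes when all members agree, which is what makes it useful as an out-of-distribution signal.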
Installation
You can install bensemble using pip:
pip install bensemble
Or, using uv for lightning-fast installation:
uv pip install bensemble
Quick Start
Example 1: Ensembling, Calibration & Uncertainty
Easily ensemble standard PyTorch models, calibrate them, and decompose their uncertainty to detect Out-Of-Distribution data.
import torch
import torch.nn as nn
from bensemble.core.ensemble import Ensemble
from bensemble.calibration.scaling import TemperatureScaling
from bensemble.uncertainty import decompose_classification_uncertainty
from bensemble.metrics import expected_calibration_error
# 1. Create a Deep Ensemble from standard trained PyTorch models
models = [nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 3)) for _ in range(5)]
ensemble = Ensemble.from_models(models)
# 2. Calibrate the ensemble using a hold-out validation set
val_logits, val_labels = torch.randn(100, 3), torch.randint(0, 3, (100,))
scaler = TemperatureScaling(init_temp=1.5).fit(val_logits, val_labels)
# 3. Predict on test data
test_x = torch.randn(10, 10)
# Returns shape: [5 models, 10 batch_size, 3 classes]
logits = scaler(ensemble.predict_members(test_x))
probs = torch.softmax(logits, dim=-1)
# 4. Decompose Uncertainty & Evaluate
total, aleatoric, epistemic = decompose_classification_uncertainty(probs)
test_y = torch.randint(0, 3, (10,))  # dummy test labels
ece = expected_calibration_error(probs.mean(dim=0), test_y)
print(f"Calibration Error (ECE): {ece:.4f}")
print(f"Epistemic Uncertainty (OOD awareness): {epistemic.mean().item():.4f}")
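For intuition, ECE as used above can be computed with equal-width confidence bins: within each bin, compare average confidence to empirical accuracy, then weight by bin population. A minimal sketch (illustrative only, not the library's `expected_calibration_error` implementation):

```python
import torch

def ece(probs: torch.Tensor, labels: torch.Tensor, n_bins: int = 10) -> float:
    """Expected Calibration Error with equal-width confidence bins.
    probs: [batch, n_classes] probabilities; labels: [batch] class indices."""
    conf, pred = probs.max(dim=-1)
    correct = (pred == labels).float()
    bins = torch.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            gap = (correct[mask].mean() - conf[mask].mean()).abs()
            total += mask.float().mean().item() * gap.item()  # population-weighted gap
    return total

probs = torch.tensor([[0.9, 0.1], [0.8, 0.2], [0.6, 0.4]])
labels = torch.tensor([0, 0, 1])
score = ece(probs, labels)
```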
Example 2: Variational Inference
Build a Bayesian Neural Network from scratch using our custom layers with the Local Reparameterization Trick.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from bensemble.layers import BayesianLinear
from bensemble.losses import VariationalLoss, GaussianLikelihood
from bensemble.utils import get_total_kl, predict_with_uncertainty
# 1. Define Model using Bayesian Layers
model = nn.Sequential(
BayesianLinear(10, 50, prior_sigma=1.0),
nn.ReLU(),
BayesianLinear(50, 1, prior_sigma=1.0),
)
# 2. Define Objectives (Likelihood + Divergence)
likelihood = GaussianLikelihood()
criterion = VariationalLoss(likelihood, alpha=1.0)
optimizer = torch.optim.Adam(list(model.parameters()) + list(likelihood.parameters()), lr=0.01)
# 3. Standard PyTorch Training Loop
model.train()
for epoch in range(50): # Dummy loop
x, y = torch.randn(10, 10), torch.randn(10, 1)
optimizer.zero_grad()
loss = criterion(model(x), y, get_total_kl(model))
loss.backward()
optimizer.step()
# 4. Predict with Uncertainty
mean, std = predict_with_uncertainty(model, torch.randn(5, 10), num_samples=100)
print(f"Prediction: {mean[0].item():.2f} ยฑ {std[0].item():.2f}")
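The Local Reparameterization Trick mentioned above samples Gaussian noise in pre-activation space rather than sampling weights directly, which lowers gradient variance per example. A self-contained sketch of such a layer, assuming a factorized Gaussian weight posterior (illustrative; Bensemble's `BayesianLinear` may differ in its parameterization):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LRTLinear(nn.Module):
    """Linear layer sampled via the Local Reparameterization Trick."""
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight_mu = nn.Parameter(torch.randn(out_features, in_features) * 0.1)
        self.weight_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        sigma = F.softplus(self.weight_rho)          # positive posterior std
        act_mu = F.linear(x, self.weight_mu, self.bias)
        act_var = F.linear(x.pow(2), sigma.pow(2))   # pre-activation variance
        eps = torch.randn_like(act_mu)               # noise per activation, not per weight
        return act_mu + act_var.clamp_min(1e-12).sqrt() * eps

layer = LRTLinear(10, 4)
out = layer(torch.randn(8, 10))
```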
Algorithms & Demos
We implement a wide range of Bayesian and Ensembling approaches. Check out the interactive demos in the notebooks/ directory:
| Method | Description |
|---|---|
| Deep Ensembles | Naive yet powerful ensembling of independent networks with explicit uncertainty decomposition. |
| Monte Carlo Dropout | Implicit ensembling by keeping dropout active at test time. |
| Neural Ensemble Search (NES) | Automatically searches for diverse architectures (NES-RS/NES-RE) using NNI. |
| NES via Bayesian Sampling | Extracts diverse subnetworks from a Supernet using Stein Variational Gradient Descent (SVGD). |
| Variational Inference | Approximates posterior using Gaussian distributions with the Local Reparameterization Trick. |
| Variational Rényi | Generalization of VI minimizing $\alpha$-divergence (VR-VI) for better robustness. |
| Laplace Approximation | Fits a Gaussian around the MAP estimate using Kronecker-Factored Curvature (K-FAC). |
| Probabilistic Backprop | Propagates moments through the network using Assumed Density Filtering (ADF). |
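Monte Carlo Dropout from the table needs no special layers: keep dropout stochastic at inference time and aggregate repeated forward passes. A generic PyTorch sketch, independent of Bensemble's wrapper:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 50), nn.ReLU(), nn.Dropout(p=0.2), nn.Linear(50, 1))

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, num_samples: int = 50):
    """Average stochastic forward passes; the sample std estimates epistemic uncertainty."""
    was_training = model.training
    model.train()  # keep dropout active during inference
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(num_samples)])
    model.train(was_training)  # restore the previous mode
    return samples.mean(dim=0), samples.std(dim=0)

mean, std = mc_dropout_predict(model, torch.randn(5, 10))
```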
Structure
bensemble/
├── core/            # Base protocols, ensemble abstractions, and adapters
│   ├── ensemble.py  # Central `Ensemble` class
│   └── member.py    # Adapters for explicit and stochastic models
│
├── layers/          # Bayesian layers for Variational Inference
│   ├── linear.py    # Bayesian Linear layer
│   └── conv.py      # Bayesian Convolution layer
│
├── search/          # Neural Ensemble Search algorithms
│   ├── nes.py       # NES-RS & NES-RE
│   └── bayesian.py  # NES via Bayesian Sampling
│
├── diversity/       # Methods to induce ensemble variation
│   └── dropout.py   # Monte Carlo Dropout wrapper
│
├── uncertainty/     # Uncertainty analysis
│   └── decomposition.py  # Separation of aleatoric and epistemic uncertainty
│
├── calibration/     # Post-hoc model calibration tools
│   └── scaling.py   # Temperature Scaling and Vector Scaling
│
└── metrics.py       # Scoring rules: ECE, NLL, Brier Score
Development Setup
If you want to contribute to bensemble or run tests, we recommend using uv.
# 1. Clone the repository
git clone https://github.com/intsystems/bensemble.git
cd bensemble
# 2. Create and activate virtual environment via uv
uv venv
source .venv/bin/activate # on Windows: .venv\Scripts\activate
# 3. Install in editable mode with dev dependencies
uv pip install -e ".[dev]"
Run Tests
pytest tests/
Linting
We use ruff to keep code clean:
ruff check .
ruff format .
Authors
Developed by:
License
This project is licensed under the MIT License - see the LICENSE file for details.
File details
Details for the file bensemble-0.2.0.tar.gz.
File metadata
- Download URL: bensemble-0.2.0.tar.gz
- Upload date:
- Size: 34.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.19
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `b139a6cce56b1e905cc6944a6959cc127d4e17a528aeede53a6a9cf2e9bec03b` |
| MD5 | `730f738162e8c1262a9201eced8a7303` |
| BLAKE2b-256 | `fbd6d7ace9ed6dbdf378f7ec0e602cc4d68015e27c4fa2db94ab5a50b441062f` |
File details
Details for the file bensemble-0.2.0-py3-none-any.whl.
File metadata
- Download URL: bensemble-0.2.0-py3-none-any.whl
- Upload date:
- Size: 35.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.19
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `3e1b8732edf68fa5115a72115cb2d450cd3aa87a9e1815bc704ffb40a5eafddc` |
| MD5 | `300057de838641a54e970bed32a5dc18` |
| BLAKE2b-256 | `466722333c4db590170a32b60ee57690dd4a8c81da7d3078c83378ed2be54d41` |