
Bayesian ensemble methods for neural networks

Project description

|test| |codecov| |docs|

.. |test| image:: https://github.com/intsystems/ProjectTemplate/workflows/test/badge.svg
    :target: https://github.com/intsystems/ProjectTemplate/tree/master
    :alt: Test status

.. |codecov| image:: https://img.shields.io/codecov/c/github/intsystems/ProjectTemplate/master
    :target: https://app.codecov.io/gh/intsystems/ProjectTemplate
    :alt: Test coverage

.. |docs| image:: https://github.com/intsystems/ProjectTemplate/workflows/docs/badge.svg
    :target: https://intsystems.github.io/ProjectTemplate/
    :alt: Docs status

.. class:: center

:Research topic: Bayesian ensembling
:Type of work: Course project for the Bayesian Multimodeling course
:Authors: Соболевский Федор, Набиев Мухаммадшариф, Василенко Дмитрий, Касюк Вадим
:Scientific advisor: Бахтеев Олег Юрьевич, Cand. Sc. (Phys.-Math.)
:Scientific consultant (if any): degree, Full Name

Abstract: Comparative Analysis of Bayesian Deep Learning Methods for Multimodel Ensembling

Bayesian methods provide a principled framework for uncertainty quantification in deep learning, which is crucial for safety-critical applications. This project presents a comprehensive comparative study of four prominent variational inference algorithms for Bayesian neural networks: the baseline Evidence Lower Bound (ELBO) with the local reparameterization trick, hyperparameter optimization, and pruning; the Renyi divergence bound as a generalization of the ELBO that offers tunable mode-seeking/mass-covering behavior; a scalable Laplace approximation for efficient posterior estimation; and Bayes by Backpropagation as an alternative variational inference scheme.
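For reference, the two variational objectives contrasted above are the standard ELBO and the Renyi (VR) bound that generalizes it; the notation below (weights :math:`w`, data :math:`\mathcal{D}`, approximate posterior :math:`q`) is introduced here only to fix ideas and is not taken from the package itself:

.. math::

   \mathcal{L}_{\mathrm{ELBO}}(q) = \mathbb{E}_{q(w)}\bigl[\log p(\mathcal{D} \mid w)\bigr] - \mathrm{KL}\bigl(q(w) \,\|\, p(w)\bigr),

.. math::

   \mathcal{L}_{\alpha}(q) = \frac{1}{1-\alpha} \, \log \mathbb{E}_{q(w)}\!\left[\left(\frac{p(w, \mathcal{D})}{q(w)}\right)^{1-\alpha}\right].

The Renyi bound recovers the ELBO in the limit :math:`\alpha \to 1`; values :math:`\alpha < 1` push the approximate posterior toward mass-covering solutions, while :math:`\alpha > 1` is more mode-seeking.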

We implement these methods in a unified framework and evaluate them on benchmark classification and regression tasks using metrics that include predictive accuracy, negative log-likelihood, expected calibration error, and out-of-distribution detection performance. Our results demonstrate the trade-offs between computational efficiency, uncertainty quantification quality, and model performance across the different methods. Specifically, we show how the Renyi divergence bound with α < 1 provides improved uncertainty calibration compared to the standard ELBO, while the scalable Laplace approximation offers competitive performance with reduced computational overhead. The study provides practical guidance for selecting an appropriate Bayesian inference method based on application requirements and computational constraints.
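To make the calibration metric concrete, here is a minimal sketch of the standard expected calibration error (ECE) over equal-width confidence bins. This is a generic NumPy implementation written for this description, not the project's evaluation code, and the default of 15 bins is an arbitrary choice:

.. code-block:: python

    import numpy as np

    def expected_calibration_error(probs, labels, n_bins=15):
        """Expected calibration error over equal-width confidence bins.

        probs  -- (N, C) array of predicted class probabilities
        labels -- (N,) array of integer class labels
        """
        confidences = probs.max(axis=1)       # top predicted probability per sample
        predictions = probs.argmax(axis=1)    # predicted class per sample
        accuracies = (predictions == labels).astype(float)

        bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
        ece = 0.0
        for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
            in_bin = (confidences > lo) & (confidences <= hi)
            if in_bin.any():
                # Gap between accuracy and confidence in the bin, weighted by bin mass.
                gap = abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
                ece += in_bin.mean() * gap
        return ece

Applied to the mean predictive probabilities of an ensemble or of Monte Carlo posterior samples, lower values indicate better calibration.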

Keywords: Bayesian Deep Learning, Variational Inference, Uncertainty Quantification, Renyi Divergence, Laplace Approximation, Neural Networks Ensembling

Bayesian Multimodeling Project

A comprehensive comparative study of Bayesian deep learning methods for neural network ensembling and uncertainty quantification.

Project documentation can be found here: https://intsystems.github.io/bensemble/

Blog post draft: https://github.com/intsystems/bensemble/blob/master/paper/blogpost_draft/blog_post_draft.pdf

Techreport draft: https://github.com/intsystems/bensemble/blob/master/paper/techreport_draft/bensemble_Tech_Report_draft.pdf

Benchmark notebook: https://github.com/intsystems/bensemble/blob/master/notebooks/benchmark.ipynb

Demos

Practical Variational Inference: https://github.com/intsystems/bensemble/blob/master/notebooks/pvi_demo.ipynb

Kronecker-Factored Laplace Approximation: https://github.com/intsystems/bensemble/blob/master/notebooks/laplace_demo.ipynb

Variational Inference with Renyi Divergence: https://github.com/intsystems/bensemble/blob/master/notebooks/variatinal_renyi_demo.ipynb

Bayes by Backprop: https://github.com/intsystems/bensemble/blob/master/notebooks/pbp_probabilistic_backpropagation_test.ipynb
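For orientation, the Bayes by Backprop demo trains mean-field Gaussian layers with the reparameterization trick. A generic PyTorch sketch of such a layer (written for this description; it does not reflect the package's actual classes or API) looks roughly like this:

.. code-block:: python

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BayesLinear(nn.Module):
        """Mean-field Gaussian linear layer for Bayes by Backprop (generic sketch)."""

        def __init__(self, in_features, out_features, prior_std=1.0):
            super().__init__()
            self.prior_std = prior_std
            # Variational parameters: mean and (softplus-parameterized) std of weights and bias.
            self.w_mu = nn.Parameter(torch.randn(out_features, in_features) * 0.05)
            self.w_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))
            self.b_mu = nn.Parameter(torch.zeros(out_features))
            self.b_rho = nn.Parameter(torch.full((out_features,), -5.0))

        def forward(self, x):
            w_std = F.softplus(self.w_rho)
            b_std = F.softplus(self.b_rho)
            # Reparameterization trick: w = mu + std * eps, eps ~ N(0, I).
            w = self.w_mu + w_std * torch.randn_like(w_std)
            b = self.b_mu + b_std * torch.randn_like(b_std)
            # KL between the factorized Gaussian posterior and a N(0, prior_std^2) prior.
            self.kl = self._kl(self.w_mu, w_std) + self._kl(self.b_mu, b_std)
            return F.linear(x, w, b)

        def _kl(self, mu, std):
            prior_var = self.prior_std ** 2
            return 0.5 * ((std ** 2 + mu ** 2) / prior_var
                          - 1.0 - 2.0 * torch.log(std / self.prior_std)).sum()

A training step would then minimize the negative ELBO, e.g. ``data_size * F.cross_entropy(model(x), y)`` plus the sum of the layers' ``kl`` terms; the notebook linked above uses the project's own, more complete implementation.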

Download files

Download the file for your platform.

Source Distribution

bensemble-0.1.0.tar.gz (20.5 kB)

Uploaded Source

Built Distribution


bensemble-0.1.0-py3-none-any.whl (24.1 kB)

Uploaded Python 3

File details

Details for the file bensemble-0.1.0.tar.gz.

File metadata

  • Download URL: bensemble-0.1.0.tar.gz
  • Upload date:
  • Size: 20.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.10

File hashes

Hashes for bensemble-0.1.0.tar.gz:

  • SHA256: 276a84eff1ea232f6be6b6c05113049954cbce0a153c93948989e20435916040
  • MD5: 2b91693d9b28a02d31e54453648f66c5
  • BLAKE2b-256: d34c4cdf77df16cb759c091f2233e6e017b3b91faff708061beaf64e523fc400


File details

Details for the file bensemble-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: bensemble-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 24.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.10

File hashes

Hashes for bensemble-0.1.0-py3-none-any.whl:

  • SHA256: 801cf4f4fde7f5342dcc1e244d69b2aaee8ea0b92735222059fe83985cf9409a
  • MD5: 2c8ee1576e5a993a0641e4e1a6898cb8
  • BLAKE2b-256: 66c09454bec3627c12df0920d307c10a2d56ef0c7a1abcbf3ba2b6713b99eda2

