
A machine learning library

Project description

Bayesian Linear Regression with Hamiltonian Monte Carlo (HMC)

This repository contains code for performing Bayesian Linear Regression using Hamiltonian Monte Carlo (HMC) sampling. Bayesian Linear Regression is a probabilistic approach to regression modeling that provides uncertainty estimates for the model parameters. HMC is a Markov Chain Monte Carlo (MCMC) method that uses Hamiltonian dynamics to generate samples from the posterior distribution.

Overview

Bayesian Linear Regression with HMC combines the flexibility of Bayesian modeling with the efficiency of HMC sampling to estimate the posterior distribution of regression coefficients. This allows for principled uncertainty quantification and robust inference in regression tasks.

How to Use

To use Bayesian Linear Regression with HMC, follow these steps:

  1. Prepare Your Data: Ensure your input features X and target variable y are in the appropriate format. X should be a 2D numpy array where each row represents a sample and each column represents a feature. y should be a 1D numpy array containing the corresponding target values.

  2. Instantiate the Model: Create an instance of the BayesianLinearRegression class.

  3. Perform Regression: Call the hamiltonian_monte_carlo method of the model instance, passing your data and the desired parameters, such as the number of samples (num_samples) and the number of leapfrog steps (L). A usage sketch follows these steps.

  4. Analyze Results: Use the posterior samples to make predictions, compute error metrics, and quantify uncertainty in the coefficients.
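
Below is a minimal usage sketch of the workflow above. The BayesianLinearRegression class, the hamiltonian_monte_carlo method, and the num_samples and L parameters are taken from the steps above; the import path, the constructor arguments, and the exact call signature and return shape are assumptions and may differ from the released API.

    import numpy as np
    from cortexflow import BayesianLinearRegression  # import path assumed

    # 1. Prepare the data: X is (n_samples, n_features), y is (n_samples,)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    true_w = np.array([1.5, -2.0, 0.5])
    y = X @ true_w + rng.normal(scale=0.1, size=100)

    # 2. Instantiate the model
    model = BayesianLinearRegression()

    # 3. Run HMC; any argument beyond num_samples and L is an assumption
    samples = model.hamiltonian_monte_carlo(X, y, num_samples=1000, L=20)

    # 4. Analyze the posterior samples (assumed shape: (num_samples, n_features))
    samples = np.asarray(samples)
    posterior_mean = samples.mean(axis=0)  # point estimate of the coefficients
    posterior_std = samples.std(axis=0)    # per-coefficient uncertainty
    y_pred = X @ posterior_mean            # posterior-mean predictions

If the method also expects a step size or an initial parameter vector, add those arguments accordingly; the structure of the four steps stays the same.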

Theoretical Details

  • Bayesian Linear Regression: Bayesian Linear Regression is a probabilistic approach to regression modeling that treats the regression coefficients as random variables and estimates their posterior distribution using Bayes' theorem.

  • Hamiltonian Monte Carlo (HMC): HMC is a Markov Chain Monte Carlo (MCMC) method that simulates Hamiltonian dynamics to sample from complex probability distributions. It uses a Hamiltonian function that combines potential energy (the negative log-posterior) and kinetic energy to guide exploration of the parameter space; the standard formulation is written out below.
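
For reference, the standard formulation behind these two points (not specific to this package) is: the posterior over the coefficients w combines the likelihood and the prior via Bayes' theorem, and HMC augments w with a momentum vector p, then simulates Hamiltonian dynamics using L leapfrog steps of size \epsilon per proposal.

    p(w \mid X, y) \propto p(y \mid X, w)\, p(w)

    H(w, p) = U(w) + K(p), \quad U(w) = -\log p(w \mid X, y), \quad K(p) = \tfrac{1}{2} p^\top p

    p \leftarrow p - \tfrac{\epsilon}{2} \nabla U(w), \qquad w \leftarrow w + \epsilon\, p, \qquad p \leftarrow p - \tfrac{\epsilon}{2} \nabla U(w)

The last line is one leapfrog update; repeating it L times and then accepting or rejecting the proposal with a Metropolis step yields one posterior sample.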

Requirements

  • Python 3.x
  • NumPy
  • SciPy

References

  • Neal, R. M. (2011). MCMC using Hamiltonian dynamics. In Handbook of Markov Chain Monte Carlo. Chapman and Hall/CRC.
  • Gelman, A., Carlin, J. B., Stern, H. S., Dunson, D. B., Vehtari, A., & Rubin, D. B. (2013). Bayesian Data Analysis (3rd ed.). Chapman and Hall/CRC.

License

This project is licensed under the MIT License - see the LICENSE.md file for details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

cortexflow-2.0.1.tar.gz (3.6 kB)


File details

Details for the file cortexflow-2.0.1.tar.gz.

File metadata

  • Download URL: cortexflow-2.0.1.tar.gz
  • Upload date:
  • Size: 3.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.12

File hashes

Hashes for cortexflow-2.0.1.tar.gz

  • SHA256: b33563796539b3015f1e9f3e93637ad31315db0d9266921fc1b50dfd5223cac4
  • MD5: 5a5b4ad980312180219ca084e39342e5
  • BLAKE2b-256: c5fd2afc4ca0a301d9a6a4b96f091a9e230be3a81e434c7c22cc422087693021

