
Neural Bayesian Inference

Project description


nbi: neural bayesian inference

Documentation

Do you have challenging inference problems that are difficult to solve with standard optimization and/or MCMC methods? Are you looking to fit the same forward model to thousands or millions of observed targets? nbi may be your solution.

nbi is an engine for Neural Posterior Estimation (NPE) focused on out-of-the-box functionality for astronomical data, particularly light curves and spectra. nbi provides effective embedding/featurizer networks for spectra and light-curve data, along with importance-sampling integration that enables asymptotically exact inference so that the inference results are interpretable and trustworthy.
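For context, the reweighting follows the standard importance-sampling identity (the notation below is generic, not nbi-specific): posterior samples θ_i drawn from the trained flow q_φ(θ | x) receive weights

w_i \propto p(x \mid \theta_i) \, p(\theta_i) / q_\phi(\theta_i \mid x)

so that weighted averages over the samples converge to exact posterior expectations as the number of samples grows, provided the likelihood p(x | θ) can be evaluated.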

Installation

You may install nbi either with pip install nbi or directly from source. As nbi is currently under active development, installing from source may be preferable at this stage.

git clone https://github.com/kmzzhang/nbi.git
cd nbi
pip install .

If you are using a Mac with an ARM CPU (i.e. M1/M2/M3), you may want to build PyTorch from source with NNPACK disabled, as NNPACK is known to reduce performance on these chips (see issue). Note also that support for weight_norm on the MPS device (M1-M3 GPUs) has recently been implemented but has not yet been included in a stable PyTorch release; building from source (or using the nightly version) also enables weight_norm for the MPS device.

git clone --recursive https://github.com/pytorch/pytorch
cd pytorch
USE_NNPACK=0 python setup.py install

Quick Start

The examples/ directory contains complete examples that demonstrate the functionality of nbi. The bare-bones example below illustrates the basic API, which follows the scikit-learn style. The default featurizer network for sequential data is resnet-gru, a hybrid CNN-RNN architecture.

Here are some rules of thumb for the resnet-gru hyperparameters (a short sketch turning them into numbers follows this list):

  • dim_in: the number of input data channels.
  • depth: the number of ResNet blocks. Start near log2(L) - 5, where L is the length of your sequential data (e.g. depth ≈ 7 for L = 4096).
  • max_hidden: the maximum hidden dimension of the ResNet. The hidden dimension doubles with each block, starting from hidden_conv=32 by default. Set it to at least a few times D^2, where D is the dimension of the physical parameter space.
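As a concrete illustration, the helper below (hypothetical, not part of the nbi API) turns those rules of thumb into a featurizer configuration:

import math

# Hypothetical helper: suggest resnet-gru hyperparameters from the data shape.
def suggest_resnet_gru(seq_length, n_params, n_channels=1):
    depth = max(1, round(math.log2(seq_length)) - 5)   # start near log2(L) - 5
    max_hidden = 32                                     # hidden_conv default
    while max_hidden < 4 * n_params ** 2:               # at least a few times D^2
        max_hidden *= 2                                 # doubles once per block
    return {"type": "resnet-gru", "dim_in": n_channels,
            "depth": depth, "max_hidden": max_hidden}

# e.g. a single-channel light curve of length 4096 with 5 physical parameters
print(suggest_resnet_gru(4096, 5))
# {'type': 'resnet-gru', 'dim_in': 1, 'depth': 7, 'max_hidden': 128}

The bare-bones example itself:
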
import nbi

# hyperparameters
featurizer = {
    "type": "resnet-gru",
    "dim_in": 1,
    "max_hidden": 64
}

flow = {
    "n_dims": 1,        # parameter space dimension
    "flow_hidden": 32,  # generally no larger than max_hidden
    "num_blocks": 10    # depends on complexity of posterior shape
}

engine = nbi.NBI(
    flow,
    featurizer,
    simulator,          # user-defined forward model (see examples/)
    noise,              # user-defined noise model applied to simulated data
    priors,             # priors over the physical parameters
    device='cpu'        # 'cuda', 'cuda:0', or 'mps' for M1/M2/M3 Mac GPUs
)
engine.fit(
    n_sims=1000,
    n_rounds=1,
    n_epochs=100
)
y_pred, weights = engine.predict(x_obs, x_err, n_samples=2000)
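The importance weights returned by predict() can be used to form posterior summaries. A minimal post-processing sketch, assuming y_pred has shape (n_samples, n_dims) and weights has shape (n_samples,) (check the documentation for the exact conventions):

import numpy as np

# Normalize the importance weights and compute weighted posterior summaries.
w = np.asarray(weights, dtype=float)
w = w / w.sum()

post_mean = np.average(y_pred, axis=0, weights=w)                             # weighted posterior mean
post_std = np.sqrt(np.average((y_pred - post_mean) ** 2, axis=0, weights=w))  # weighted posterior std
ess = 1.0 / np.sum(w ** 2)   # effective sample size; a low value means the
                             # proposal covers the posterior poorly
print(post_mean, post_std, ess)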

References

nbi: the Astronomer's Package for Neural Posterior Estimation (Zhang et al. 2023). Accepted to the "Machine Learning for Astrophysics" workshop at the 2023 International Conference on Machine Learning (ICML). Will be posted to arXiv soon.

Masked Autoregressive Flow for Density Estimation (Papamakarios et al. 2017)
https://arxiv.org/abs/1705.07057

Featurizers: ResNet (He et al. 2015; https://arxiv.org/abs/1512.03385), Gated Recurrent Units (GRU; Cho et al. 2014; https://arxiv.org/abs/1406.1078), ResNet-GRU (Zhang et al. 2021; https://iopscience.iop.org/article/10.3847/1538-3881/abf42e)

Acknowledgments

The nbi package is expanded from code originally written for "Real-time Likelihood-free Inference of Roman Binary Microlensing Events with Amortized Neural Posterior Estimation" (Zhang et al. 2021). The Masked Autoregressive Flow in this package is partly adapted from the implementation in https://github.com/kamenbliznashki/normalizing_flows. Work on this project was supported by the National Science Foundation award #2206744 ("CDS&E: Accelerating Astrophysical Insight at Scale with Likelihood-Free Inference").

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nbi-0.4.1.tar.gz (26.5 kB)

Uploaded Source

Built Distribution

nbi-0.4.1-py3-none-any.whl (25.1 kB)

Uploaded Python 3

File details

Details for the file nbi-0.4.1.tar.gz.

File metadata

  • Download URL: nbi-0.4.1.tar.gz
  • Upload date:
  • Size: 26.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for nbi-0.4.1.tar.gz
Algorithm Hash digest
SHA256 e3e847a47d1cc164899a98dfcad120622f349805391c44a50281fc37f6a10d96
MD5 f484c8ae33e115fb62422e4b469d2991
BLAKE2b-256 295c9c752fa746ba5bb54ee133deeb769f864134481bbcd78c649f76a856e06f


File details

Details for the file nbi-0.4.1-py3-none-any.whl.

File metadata

  • Download URL: nbi-0.4.1-py3-none-any.whl
  • Upload date:
  • Size: 25.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for nbi-0.4.1-py3-none-any.whl
Algorithm Hash digest
SHA256 91df5a53d8bcb99f1fe5cac7783ef4b4dcb2cd91e47dc47a002ac858d80e5aa0
MD5 84dfdb2fcfb015dcbcf117e13e834c75
BLAKE2b-256 966392694586dc34f001aaf1bc388a474a632b373bdfc351ba9a20f6a9641576

