BLayers

Bayesian layers for NumPyro and JAX.

NOTE: BLayers is in alpha. Expect changes. Feedback welcome.

write code immediately

pip install blayers

Dependencies are numpy, numpyro, and jax; optax is recommended.

concept

The missing layers package for Bayesian inference. Inspired by Keras and TensorFlow Probability, but built specifically for NumPyro + JAX.

Easily build Bayesian models from parts, abstract away the boilerplate, and tweak priors as you wish.

Fit models using Variational Inference (VI) or your sampling method of choice. Use BLayers' ELBO implementation to do either batched VI or sampling without having to rewrite your models.

BLayers helps you write pure Numpyro, so you can integrate it with any Numpyro code to build models of arbitrary complexity. It also gives you a recipe to build more complex layers as you wish.

the starting point

The simplest non-trivial (and most important!) Bayesian regression model form is the adaptive prior,

lmbda ~ HalfNormal(1)
beta  ~ Normal(0, lmbda)
y     ~ Normal(beta * x, 1)

BLayers takes this as its starting point and most fundamental building block, providing the flexible AdaptiveLayer.

from blayers import AdaptiveLayer, gaussian_link_exp
def model(x, y):
    mu = AdaptiveLayer()('mu', x)
    return gaussian_link_exp(mu, y)
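
Because a BLayers model is ordinary NumPyro underneath, the same `model` can also be handed to a sampler instead of VI. A minimal sketch using NumPyro's NUTS (the data arrays `x` and `y` are assumed to exist):

from jax import random
from numpyro.infer import MCMC, NUTS

# sample the exact same model function with HMC/NUTS
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(random.PRNGKey(0), x=x, y=y)
posterior_samples = mcmc.get_samples()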

pure numpyro

All BLayers is doing is writing NumPyro for you under the hood. This model is exactly equivalent to writing the following, just using far less code.

import jax.numpy as jnp
from numpyro import distributions, sample

def model(x, y):
    # Adaptive layer does all of this
    input_shape = x.shape[1]
    # adaptive prior
    lmbda = sample(
        name="lmbda",
        fn=distributions.HalfNormal(1.),
    )
    # beta coefficients for regression
    beta = sample(
        name="beta",
        fn=distributions.Normal(loc=0., scale=lmbda),
        sample_shape=(input_shape,),
    )
    mu = jnp.einsum('ij,j->i', x, beta)

    # the link function does this
    sigma = sample(name='sigma', fn=distributions.Exponential(1.))
    return sample('obs', distributions.Normal(mu, sigma), obs=y)

mixing it up

The AdaptiveLayer is also fully parameterizable via arguments to the class, so let's say you wanted to change the model from

lmbda ~ HalfNormal(1)
beta  ~ Normal(0, lmbda)
y     ~ Normal(beta * x, 1)

to

lmbda ~ Exponential(1.)
beta  ~ LogNormal(0, lmbda)
y     ~ Normal(beta * x, 1)

you can just do this directly via arguments

from numpyro import distributions
from blayers import AdaptiveLayer, gaussian_link_exp
def model(x, y):
    mu = AdaptiveLayer(
        lmbda_dist=distributions.Exponential,
        prior_dist=distributions.LogNormal,
        lmbda_kwargs={'rate': 1.},
        prior_kwargs={'loc': 0.}
    )('mu', x)
    return gaussian_link_exp(mu, y)

"factories"

Since Numpyro traces sample sites and doesn't record any parameters on the class, you can freely re-use a layer instance configured with a particular generative model structure.

from numpyro import distributions
from blayers import AdaptiveLayer, gaussian_link_exp

my_lognormal_layer = AdaptiveLayer(
    lmbda_dist=distributions.Exponential,
    prior_dist=distributions.LogNormal,
    lmbda_kwargs={'rate': 1.},
    prior_kwargs={'loc': 0.}
)

def model(x, y):
    mu = my_lognormal_layer('mu1', x) + my_lognormal_layer('mu2', x**2)
    return gaussian_link_exp(mu, y)

additional layers

fixed prior layers

For you purists out there, we also provide a FixedPriorLayer for standard L1/L2 regression.

from blayers import FixedPriorLayer, gaussian_link_exp
def model(x, y):
    mu = FixedPriorLayer()('mu', x)
    return gaussian_link_exp(mu, y)

Very useful when you have an informative prior.
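
For instance, assuming FixedPriorLayer accepts prior_dist / prior_kwargs arguments in the same style as AdaptiveLayer (a hypothetical reading of the API; check the docs), an informative Normal(2, 0.5) prior on the coefficients might look like:

from numpyro import distributions
from blayers import FixedPriorLayer, gaussian_link_exp

def model(x, y):
    # prior_dist / prior_kwargs assumed to mirror AdaptiveLayer's interface
    mu = FixedPriorLayer(
        prior_dist=distributions.Normal,
        prior_kwargs={'loc': 2., 'scale': 0.5},
    )('mu', x)
    return gaussian_link_exp(mu, y)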

factorization machines

Developed in Rendle 2010 and Rendle 2011, FMs provide a low-rank approximation to the x-by-x interaction matrix. For those familiar with R syntax, it is an approximation to y ~ x:x, excluding the x^2 terms.
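
Concretely, instead of a full d-by-d matrix of interaction weights, each feature i gets a length-k vector v_i (k = low_rank_dim), and the interaction part of the predictor is the standard FM form from Rendle 2010:

interaction(x) = sum_{i < j} <v_i, v_j> * x_i * x_j

so the number of interaction parameters grows as d*k rather than d^2.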

To fit the equivalent of an R model like y ~ x*x (all main effects, x^2 terms, and pairwise interaction effects), you'd do

from blayers import AdaptiveLayer, FMLayer, gaussian_link_exp
def model(x, y):
    mu = (
        AdaptiveLayer()('x', x) +
        AdaptiveLayer()('x2', x**2) +
        FMLayer(low_rank_dim=3)('xx', x)
    )
    return gaussian_link_exp(mu, y)

uv decomp

We also provide a standard UV decomposition for low-rank interaction terms.

from blayers import AdaptiveLayer, LowRankInteractionLayer, gaussian_link_exp
def model(x, z, y):
    mu = (
        AdaptiveLayer()('x', x) +
        AdaptiveLayer()('z', z) +
        LowRankInteractionLayer(low_rank_dim=3)('xz', x, z)
    )
    return gaussian_link_exp(mu, y)

bayesian embeddings

links

We provide link functions as a convenience to abstract away a bit more Numpyro boilerplate.

We currently provide

  • gaussian_link_exp
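
Links follow the same pattern as the pure-NumPyro expansion shown earlier, so rolling your own takes a few lines. A minimal sketch of a hypothetical Poisson link (not part of the current API):

import jax.numpy as jnp
from numpyro import distributions, sample

def poisson_link(mu, y=None):
    # hypothetical link: exponentiate the linear predictor to get a positive rate
    return sample('obs', distributions.Poisson(jnp.exp(mu)), obs=y)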

batched loss

The default Numpyro way to fit batched VI models is to use plate, which confuses me a lot. Instead, BLayers provides Batched_Trace_ELBO, which does not require you to use plate to batch in VI. Just drop your model in.

import optax
from jax import random
from numpyro.infer import SVI
from numpyro.infer.autoguide import AutoNormal
from blayers.infer import Batched_Trace_ELBO, svi_run_batched

guide = AutoNormal(model)  # or any guide compatible with your model
svi = SVI(model, guide, optax.adam(1e-2), loss=Batched_Trace_ELBO())

svi_result = svi_run_batched(
    svi,
    random.PRNGKey(0),
    10_000,  # num_steps
    batch_size=1000,
    **model_data,  # the arrays your model takes, e.g. x=x, y=y
)
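
Assuming svi_run_batched returns the same SVIRunResult structure as NumPyro's built-in SVI.run (worth verifying against the docs), the fitted parameters and loss trace come straight off the result:

params = svi_result.params  # fitted guide parameters
losses = svi_result.losses  # per-step ELBO loss trace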

roadmap

  1. Fit helpers for models with categorical variables
  2. Multioutput models
  3. Examples
  4. More code re-use in layers.py (this will only become clear after more code is written)
  5. More link functions
