Easy-to-use distributions, bijections and normalizing flows in JAX.


FlowJax: Distributions and Normalizing Flows in Jax

Documentation

Available at https://danielward27.github.io/flowjax/.

Short example

Training a flow can be done in a few lines of code:

from flowjax.flows import block_neural_autoregressive_flow
from flowjax.train import fit_to_data
from flowjax.distributions import Normal
from jax import random
import jax.numpy as jnp

data_key, flow_key, train_key, sample_key = random.split(random.PRNGKey(0), 4)

x = random.uniform(data_key, (5000, 2))  # Toy data
base_dist = Normal(jnp.zeros(x.shape[1]))
flow = block_neural_autoregressive_flow(flow_key, base_dist=base_dist)
flow, losses = fit_to_data(
    key=train_key,
    dist=flow,
    x=x,
    learning_rate=5e-3,
    max_epochs=200,
)

# We can now evaluate the log-probability of arbitrary points
log_probs = flow.log_prob(x)

# And sample the distribution
samples = flow.sample(sample_key, (1000,))

The package currently includes:

  • Many simple bijections and distributions, implemented as Equinox modules.
  • coupling_flow (Dinh et al., 2017) and masked_autoregressive_flow (Kingma et al., 2016, Papamakarios et al., 2017) normalizing flow architectures.
    • These can be used with arbitrary bijections as transformers, such as Affine or RationalQuadraticSpline (the latter used in neural spline flows; Durkan et al., 2019); see the first sketch after this list.
  • block_neural_autoregressive_flow, as introduced by De Cao et al., 2019.
  • planar_flow, as introduced by Rezende and Mohamed, 2015.
  • triangular_spline_flow, introduced in this package.
  • Training scripts for fitting by maximum likelihood, variational inference, or contrastive learning for sequential neural posterior estimation (Greenberg et al., 2019; Durkan et al., 2020); a minimal variational example follows this list.
  • A bisection search that numerically inverts bijections lacking a known analytic inverse, enabling, for example, both sampling and density evaluation with block neural autoregressive flows.
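
The coupling and masked autoregressive architectures accept the transformer as an argument. As a minimal sketch (the knots and interval values here are arbitrary choices for illustration), a neural-spline-style flow can be built by passing a RationalQuadraticSpline transformer:

import jax.numpy as jnp
from jax import random

from flowjax.bijections import RationalQuadraticSpline
from flowjax.distributions import Normal
from flowjax.flows import masked_autoregressive_flow

key = random.PRNGKey(0)

# Masked autoregressive flow with a spline transformer (a neural spline flow).
flow = masked_autoregressive_flow(
    key,
    base_dist=Normal(jnp.zeros(2)),
    transformer=RationalQuadraticSpline(knots=8, interval=4),
)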
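
The ready-made training routines for these objectives live in flowjax.train. Purely as a sketch of what variational fitting involves (assuming a flow with reparameterised sampling, such as a coupling flow, and with an illustrative target density and hyperparameters), a minimal hand-rolled ELBO loop built from the distribution interface above plus optax and equinox might look like:

import equinox as eqx
import jax
import jax.numpy as jnp
import optax
from jax.scipy.stats import multivariate_normal

from flowjax.distributions import Normal
from flowjax.flows import coupling_flow

def target_log_prob(x):
    # Illustrative target density; replace with your own.
    return multivariate_normal.logpdf(x, mean=jnp.ones(2), cov=0.5 * jnp.eye(2))

key, subkey = jax.random.split(jax.random.PRNGKey(0))
flow = coupling_flow(subkey, base_dist=Normal(jnp.zeros(2)))

optimizer = optax.adam(1e-3)
opt_state = optimizer.init(eqx.filter(flow, eqx.is_inexact_array))

@eqx.filter_jit
def step(flow, opt_state, key):
    def elbo_loss(flow):
        # Reparameterised samples, so gradients pass through the flow.
        samples, log_q = flow.sample_and_log_prob(key, (256,))
        return jnp.mean(log_q - target_log_prob(samples))

    loss, grads = eqx.filter_value_and_grad(elbo_loss)(flow)
    updates, opt_state = optimizer.update(grads, opt_state)
    flow = eqx.apply_updates(flow, updates)
    return flow, opt_state, loss

for _ in range(500):
    key, subkey = jax.random.split(key)
    flow, opt_state, loss = step(flow, opt_state, subkey)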

Installation

pip install flowjax

Warning

This package is in its early stages of development and may undergo significant changes, including breaking changes, between major releases. Ideally we would be on version 0.y.z to indicate this, but the package has already progressed beyond that stage.

Development

We can install a development version as follows:

git clone https://github.com/danielward27/flowjax.git
cd flowjax
pip install -e ".[dev]"
sudo apt-get install pandoc  # Required for building the documentation

Related

We make use of the Equinox package, which facilitates defining models in JAX using a PyTorch-like syntax.
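
For instance, a toy Equinox module (this Linear class is illustrative, not part of FlowJax) is declared like a dataclass, and the resulting object is itself a JAX pytree:

import equinox as eqx
import jax
import jax.numpy as jnp

class Linear(eqx.Module):
    weight: jax.Array
    bias: jax.Array

    def __init__(self, in_size, out_size, key):
        self.weight = jax.random.normal(key, (out_size, in_size))
        self.bias = jnp.zeros(out_size)

    def __call__(self, x):
        return self.weight @ x + self.bias

# Because the module is a pytree, it composes directly with jax transforms.
linear = Linear(2, 3, jax.random.PRNGKey(0))
y = jax.vmap(linear)(jnp.ones((10, 2)))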

Citation

If you found this package useful in academic work, please consider citing it using the template below, replacing [version number] and [release year of version] with the appropriate values. Version-specific DOIs can be obtained from Zenodo if desired.

@software{ward2023flowjax,
  title = {FlowJax: Distributions and Normalizing Flows in Jax},
  author = {Daniel Ward},
  url = {https://github.com/danielward27/flowjax},
  version = {[version number]},
  year = {[release year of version]},
  doi = {10.5281/zenodo.10402073},
}
