FlowJax: Normalising Flows in Jax
Documentation
Available here.
Short example
Training a flow can be done in a few lines of code:
from flowjax.flows import BlockNeuralAutoregressiveFlow
from flowjax.train_utils import train_flow
from flowjax.distributions import Normal
from jax import random
import jax.numpy as jnp
data_key, flow_key, train_key = random.split(random.PRNGKey(0), 3)
x = random.uniform(data_key, (10000, 3)) # Toy data
base_dist = Normal(jnp.zeros(x.shape[1]))
flow = BlockNeuralAutoregressiveFlow(flow_key, base_dist)
flow, losses = train_flow(train_key, flow, x, learning_rate=0.05)
# We can now evaluate the log-probability of arbitrary points
flow.log_prob(x)
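A trained flow can also be sampled from. The snippet below is a minimal sketch assuming the distribution interface mirrors log_prob above; the exact sample signature may differ between releases, so check the documentation:

# Draw new samples from the trained flow (sketch; signature may vary by version)
sample_key = random.PRNGKey(1)
samples = flow.sample(sample_key, (1000,))  # expected shape (1000, 3)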
The package currently supports the following (a usage sketch follows the list):
- CouplingFlow (Dinh et al., 2017) and MaskedAutoregressiveFlow (Papamakarios et al., 2017) conditioner architectures.
- Common "transformers", such as Affine and RationalQuadraticSpline (the latter used in neural spline flows; Durkan et al., 2019).
- BlockNeuralAutoregressiveFlow, as introduced by De Cao et al., 2019.
- TriangularSplineFlow, introduced here.
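As an illustration of combining a conditioner architecture with a transformer, the sketch below builds a coupling flow with rational quadratic spline transformers. The keyword arguments used here (transformer, knots, interval) are assumptions about the constructor signatures, so consult the documentation before relying on them:

from flowjax.flows import CouplingFlow
from flowjax.bijections import RationalQuadraticSpline
from flowjax.distributions import Normal
from jax import random
import jax.numpy as jnp

key = random.PRNGKey(0)
base_dist = Normal(jnp.zeros(3))
# Coupling flow (Dinh et al., 2017) with spline transformers (Durkan et al., 2019)
flow = CouplingFlow(key, base_dist, transformer=RationalQuadraticSpline(knots=8, interval=4))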
Installation
pip install flowjax
Warning
This package is new and may have substantial breaking changes between major releases.
TODO
A few limitations / features that may be worth adding in the future:
- Add documentation
- Add ability to "reshape" bijections.
- Training script for variational inference
Related
We make use of the Equinox package, which facilitates object-oriented programming with Jax.
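Because the flow objects are Equinox modules (and hence JAX PyTrees), standard Equinox utilities apply to them. As a sketch, a trained flow could be saved and restored with Equinox's serialisation helpers (the file name and template construction below are illustrative assumptions):

import equinox as eqx

# Serialise the trained flow's array leaves to disk
eqx.tree_serialise_leaves("flow.eqx", flow)

# Deserialise into a freshly constructed flow with the same structure
template = BlockNeuralAutoregressiveFlow(flow_key, base_dist)
flow_loaded = eqx.tree_deserialise_leaves("flow.eqx", template)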
Authors
flowjax was written by Daniel Ward <danielward27@outlook.com>.