flowjax
Normalising flows in JAX. Training a flow can be done in a few lines of code:
```python
from flowjax.flows import BlockNeuralAutoregressiveFlow
from flowjax.train_utils import train_flow
from flowjax.distributions import Normal
from jax import random

data_key, flow_key, train_key = random.split(random.PRNGKey(0), 3)

x = random.uniform(data_key, (10000, 3))  # Toy data
base_dist = Normal(3)
flow = BlockNeuralAutoregressiveFlow(flow_key, base_dist)
flow, losses = train_flow(train_key, flow, x, learning_rate=0.05)

# We can now evaluate the log-probability of arbitrary points
flow.log_prob(x)
```
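For intuition, a flow's `log_prob` follows the change-of-variables formula: map the point to the base space with the flow's bijection and add the log absolute determinant of the Jacobian. The standalone sketch below uses a toy element-wise affine bijection and is purely illustrative (it does not use the flowjax API):

```python
import jax.numpy as jnp
from jax.scipy.stats import norm

# Toy bijection z = (x - shift) / scale, mapping data to a standard normal base space.
shift = jnp.array([0.5, 0.5, 0.5])
scale = jnp.array([2.0, 2.0, 2.0])

def toy_log_prob(x):
    z = (x - shift) / scale
    # log p_X(x) = log p_Z(z) + log |det dz/dx|; the Jacobian here is diagonal.
    log_det = -jnp.sum(jnp.log(scale))
    return norm.logpdf(z).sum(axis=-1) + log_det

toy_log_prob(jnp.zeros((5, 3)))  # one log-density per row
```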
The package currently supports the following:

- `CouplingFlow` (Dinh et al., 2017) and `MaskedAutoregressiveFlow` (Papamakarios et al., 2017) architectures (the coupling idea is sketched after this list)
- Common transformers, such as `AffineTransformer` and `RationalQuadraticSplineTransformer` (the latter used in neural spline flows; Durkan et al., 2019)
- `BlockNeuralAutoregressiveFlow`, as introduced by De Cao et al., 2019
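As a rough, conceptual illustration of the coupling idea (not flowjax's implementation; `affine_coupling` and `conditioner` below are made up for this sketch), one half of the input passes through unchanged and parameterizes an element-wise affine transform of the other half, which keeps the Jacobian determinant cheap to compute:

```python
import jax.numpy as jnp

def affine_coupling(x, conditioner):
    # Split the input: x1 passes through unchanged and conditions the transform of x2.
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    shift, log_scale = conditioner(x1)  # in practice a small neural network
    y2 = x2 * jnp.exp(log_scale) + shift
    log_det = jnp.sum(log_scale, axis=-1)  # log |det Jacobian| of the element-wise affine map
    return jnp.concatenate([x1, y2], axis=-1), log_det

# Toy conditioner standing in for a neural network.
conditioner = lambda x1: (0.1 * x1, jnp.zeros_like(x1))
y, log_det = affine_coupling(jnp.ones(4), conditioner)
```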
For more detailed examples, see examples.
Installation
```
pip install flowjax
```
Warning
This package is new and may have substantial breaking changes between major releases.
TODO
A few limitations / things that could be worth including in the future:
- Support embedding networks (for dimensionality reduction of conditioning variables)
- Add batch/layer normalisation to neural networks
- Training script for variational inference
- Add documentation
Related
We make use of the Equinox package, which facilitates object-oriented programming with JAX.
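As a toy illustration of what Equinox provides (this module is made up for the example and is not part of flowjax), parameters are declared as dataclass-style fields and the module itself is a JAX pytree, so it can be passed directly through transformations such as `jax.jit`:

```python
import equinox as eqx
import jax
import jax.numpy as jnp

class AffineBijection(eqx.Module):
    # Parameters are dataclass-style fields; the module is a pytree of its fields.
    shift: jax.Array
    scale: jax.Array

    def transform(self, x):
        return x * self.scale + self.shift

bij = AffineBijection(shift=jnp.zeros(3), scale=jnp.ones(3))

@jax.jit
def apply(module, x):
    # The module is passed through jax.jit like any other pytree.
    return module.transform(x)

apply(bij, jnp.ones(3))
```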
Authors
flowjax was written by Daniel Ward <danielward27@outlook.com>.