
flowjax


Normalising flows in JAX. Training a flow can be done in a few lines of code:

from flowjax.flows import block_neural_autoregressive_flow
from flowjax.train_utils import train_flow
from flowjax.distributions import Normal
from jax import random

data_key, flow_key, train_key = random.split(random.PRNGKey(0), 3)

x = random.uniform(data_key, (10000, 3))  # Toy data: 10000 points in 3 dimensions
base_dist = Normal(3)  # 3-dimensional standard normal base distribution
flow = block_neural_autoregressive_flow(flow_key, base_dist)
flow, losses = train_flow(train_key, flow, x, learning_rate=0.05)

# We can now evaluate the log-probability of arbitrary points
flow.log_prob(x)
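
Once trained, the flow is itself a distribution object, so we can also draw new samples from it. A minimal sketch, assuming the distribution exposes a sample method taking a key and a sample shape (check the signature in your installed version):

# Draw new samples from the trained flow. The sample(key, shape)
# signature is an assumption; consult the flowjax API for your version.
sample_key = random.PRNGKey(1)
samples = flow.sample(sample_key, (1000,))  # array of shape (1000, 3)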

The package currently supports a range of flow architectures. For examples of basic usage, see the examples.

Installation

pip install flowjax

Warning

This package is new and may have substantial breaking changes between major releases.
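
Given this, it can be sensible to pin the major version when installing, for example:

pip install "flowjax>=5.0,<6.0"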

TODO

A few limitations and features that could be worth adding in the future:

  • Support embedding networks (for dimensionality reduction of conditioning variables; a sketch of the idea follows this list)
  • Add batch/layer normalisation to neural networks
  • Training script for variational inference
  • Add documentation
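
To make the first item above concrete: an embedding network is a small neural network that compresses high-dimensional conditioning variables into a low-dimensional summary before they reach the flow. A hypothetical sketch using Equinox (the Embedding class and its sizes are illustrative, not part of flowjax):

import equinox as eqx
import jax.numpy as jnp
from jax import random

class Embedding(eqx.Module):
    # Hypothetical embedding network (not part of flowjax): maps
    # high-dimensional conditioning variables to a small summary vector.
    mlp: eqx.nn.MLP

    def __init__(self, key, cond_dim=100, embed_dim=5):
        self.mlp = eqx.nn.MLP(
            in_size=cond_dim,
            out_size=embed_dim,
            width_size=64,
            depth=2,
            key=key,
        )

    def __call__(self, condition):
        return self.mlp(condition)

embed = Embedding(random.PRNGKey(2))
summary = embed(jnp.ones(100))  # summary vector of shape (5,)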

Related

We make use of the Equinox package, which facilitates object-oriented programming with JAX.
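
As a brief illustration of the style Equinox enables (a standalone sketch, independent of flowjax): models are defined as classes whose parameters are ordinary fields, and each instance is a JAX PyTree, so it composes directly with JAX transformations.

import equinox as eqx
import jax
import jax.numpy as jnp

class Scale(eqx.Module):
    # A minimal Equinox module: parameters are ordinary dataclass fields,
    # and the whole object is a JAX PyTree.
    log_scale: jax.Array

    def __call__(self, x):
        return x * jnp.exp(self.log_scale)

model = Scale(log_scale=jnp.zeros(3))

# Because the module is a PyTree, it passes through JAX transformations
# like any other argument:
y = jax.jit(lambda m, x: m(x))(model, jnp.ones(3))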

Authors

flowjax was written by Daniel Ward <danielward27@outlook.com>.
