Didactic Gaussian processes in JAX.
Quickstart | Install guide | Documentation | Slack Community
GPJax aims to provide a low-level interface to Gaussian process (GP) models in JAX, structured to give researchers maximum flexibility to extend the code to suit their own needs. The idea is that the code should be as close as possible to the maths we would write on paper when working with GP models.
Package support
GPJax was founded by Thomas Pinder. Today, the maintenance of GPJax is undertaken by Thomas Pinder and Daniel Dodd.
We would be delighted to receive contributions from interested individuals and groups. To learn how you can get involved, please read our guide for contributing. If you have any questions, we encourage you to open an issue. For broader conversations, such as best GP fitting practices or questions about the mathematics of GPs, we invite you to open a discussion.
Feel free to join our Slack Channel, where we can discuss the development of GPJax and broader support for Gaussian process modelling.
Supported methods and interfaces
Notebook examples
Guides for customisation
Conversion between `.ipynb` and `.py`
The above examples are stored in the `examples` directory in the double percent (`py:percent`) format. Check out the jupytext CLI documentation for more info.
- To convert `example.py` to `example.ipynb`, run:

  ```bash
  jupytext --to notebook example.py
  ```

- To convert `example.ipynb` to `example.py`, run:

  ```bash
  jupytext --to py:percent example.ipynb
  ```
Simple example
Let us import some dependencies and simulate a toy dataset $\mathcal{D}$.
```python
import gpjax as gpx
from jax import grad, jit
import jax.numpy as jnp
import jax.random as jr
import jaxkern as jk
import optax as ox

key = jr.PRNGKey(123)

# A latent sinusoidal function, observed under unit-variance Gaussian noise.
f = lambda x: 10 * jnp.sin(x)

n = 50
# Note: axis=0 sorts down the column; .sort() alone sorts the trailing
# axis, which is a no-op for an (n, 1) array.
x = jr.uniform(key=key, minval=-3.0, maxval=3.0, shape=(n, 1)).sort(axis=0)
y = f(x) + jr.normal(key, shape=(n, 1))
D = gpx.Dataset(X=x, y=y)
```
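Written out, the generative model we have just simulated is

$$y_i = f(x_i) + \varepsilon_i, \qquad \varepsilon_i \sim \mathcal{N}(0, 1), \qquad f(x) = 10\sin(x).$$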
The function of interest here, $f(\cdot)$, is sinusoidal, but our observations of it have been perturbed by Gaussian noise. We aim to use a Gaussian process to recover this latent function.
1. Constructing the prior and posterior
We begin by defining a zero-mean Gaussian process prior with a radial basis function (RBF) kernel, and we assume a Gaussian likelihood.
```python
prior = gpx.Prior(kernel=jk.RBF())
likelihood = gpx.Gaussian(num_datapoints=n)
```
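For reference, this prior places a zero-mean Gaussian process over the latent function, with the RBF (squared-exponential) kernel taking its standard form for variance $\sigma^2$ and lengthscale $\ell$:

$$f(\cdot) \sim \mathcal{GP}\big(0,\, k(\cdot, \cdot)\big), \qquad k(x, x') = \sigma^2 \exp\left(-\frac{\lVert x - x' \rVert^2}{2\ell^2}\right).$$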
Mirroring how we would write it on paper, the posterior is constructed as the product of our prior and our likelihood.
```python
posterior = prior * likelihood
```
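This product is exactly Bayes' theorem: the posterior over the latent function is proportional to the likelihood multiplied by the prior,

$$p(f \mid \mathcal{D}) \propto p(\mathcal{D} \mid f)\, p(f).$$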
2. Learning hyperparameters
Equipped with the posterior, we seek to learn the model's hyperparameters through gradient-based optimisation of the marginal log-likelihood. We do this below, adding JAX's just-in-time (JIT) compilation to accelerate training.
```python
mll = jit(posterior.marginal_log_likelihood(D, negative=True))
```
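For a conjugate Gaussian likelihood, this objective is available in closed form. With a zero prior mean, kernel Gram matrix $K_{xx}$, and observation-noise variance $\sigma_n^2$, the marginal log-likelihood reads

$$\log p(y \mid X, \theta) = -\frac{1}{2} y^{\top} \left(K_{xx} + \sigma_n^2 I\right)^{-1} y - \frac{1}{2} \log \left\lvert K_{xx} + \sigma_n^2 I \right\rvert - \frac{n}{2} \log 2\pi,$$

and `negative=True` returns its negation so that we can minimise.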
For purposes of optimisation, we'll use optax's Adam.

```python
opt = ox.adam(learning_rate=1e-3)
```
We define an initial parameter state through the `initialise` callable.

```python
parameter_state = gpx.initialise(posterior, key=key)
```
Finally, we run an optimisation loop using the Adam optimiser via the `fit` callable.

```python
inference_state = gpx.fit(mll, parameter_state, opt, num_iters=500)
```
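For intuition, `gpx.fit` is, in spirit, running a standard optax gradient loop like the sketch below. The `unpack` ordering on `parameter_state` and calling `mll` directly on a parameter pytree are assumptions made for illustration, so treat this as a sketch rather than the library's exact internals.

```python
import jax

# A hand-rolled sketch of the training loop performed by gpx.fit;
# parameter_state.unpack() returning the parameter pytree first is an
# assumption made purely for illustration.
params, *_ = parameter_state.unpack()
opt_state = opt.init(params)

for _ in range(500):
    # mll is the jitted negative marginal log-likelihood defined above.
    loss_val, grads = jax.value_and_grad(mll)(params)
    updates, opt_state = opt.update(grads, opt_state)
    params = ox.apply_updates(params, updates)
```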
3. Making predictions
Using our learned hyperparameters, we can obtain the posterior distribution of the latent function at novel test points.
```python
learned_params, _ = inference_state.unpack()

xtest = jnp.linspace(-3.0, 3.0, 100).reshape(-1, 1)

# Posterior distribution of the latent function at the test inputs.
latent_distribution = posterior(learned_params, D)(xtest)

# Pass the latent distribution through the likelihood to account for
# observation noise.
predictive_distribution = likelihood(learned_params, latent_distribution)

predictive_mean = predictive_distribution.mean()
predictive_cov = predictive_distribution.covariance()
```
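To visualise the fit, one might plot the predictive mean alongside a two-standard-deviation band. This is a minimal sketch, assuming matplotlib is installed and that the mean and covariance above have shapes `(100,)`-compatible and `(100, 100)` respectively:

```python
import matplotlib.pyplot as plt

# Pointwise standard deviations from the predictive covariance diagonal.
predictive_std = jnp.sqrt(jnp.diag(predictive_cov))
mean = predictive_mean.squeeze()

plt.plot(x, y, "o", alpha=0.5, label="Observations")
plt.plot(xtest, mean, label="Predictive mean")
plt.fill_between(
    xtest.squeeze(),
    mean - 2.0 * predictive_std,
    mean + 2.0 * predictive_std,
    alpha=0.2,
    label="Two sigma",
)
plt.legend()
plt.show()
```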
Installation
Stable version
The latest stable version of GPJax can be installed via `pip`:

```bash
pip install gpjax
```
Note
We recommend you check your installation version:

```bash
python -c 'import gpjax; print(gpjax.__version__)'
```
Development version
Warning
This version is possibly unstable and may contain bugs.
Clone a copy of the repository to your local machine and run the setup configuration in development mode.
```bash
git clone https://github.com/JaxGaussianProcesses/GPJax.git
cd GPJax
python setup.py develop
```
Note
We advise you to create a virtual environment before installing:

```bash
conda create -n gpjax_experimental python=3.10.0
conda activate gpjax_experimental
```
and recommend you check your installation passes the supplied unit tests:
```bash
python -m pytest tests/
```
Citing GPJax
If you use GPJax in your research, please cite our JOSS paper.
```bibtex
@article{Pinder2022,
  doi       = {10.21105/joss.04455},
  url       = {https://doi.org/10.21105/joss.04455},
  year      = {2022},
  publisher = {The Open Journal},
  volume    = {7},
  number    = {75},
  pages     = {4455},
  author    = {Thomas Pinder and Daniel Dodd},
  title     = {GPJax: A Gaussian Process Framework in JAX},
  journal   = {Journal of Open Source Software}
}
```