
A library of scalable Bayesian generalised linear models with fancy features


This library implements various Bayesian linear models (Bayesian linear regression) and generalised linear models. A few of its features are:

  • A fancy basis function/feature composition framework for combining basis functions like radial basis functions, sigmoidal basis functions, polynomial basis functions, etc.

  • Basis functions that can be used to approximate Gaussian processes with shift-invariant covariance functions (e.g. squared exponential) when used with linear models [1], [2], [3]; see the sketch after this list.

  • Non-Gaussian likelihoods with Bayesian generalised linear models using a modified version of the nonparametric variational inference algorithm presented in [4].

  • Large-scale learning using stochastic gradient descent (ADADELTA).
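
To make the kernel approximation concrete, here is a minimal, self-contained NumPy sketch of the random Fourier feature idea from [3]. It is illustrative only, independent of revrand's own RandomRBF implementation, and every name in it is hypothetical:

import numpy as np

def random_fourier_features(X, nbases=100, lenscale=1.0, seed=0):
    # Frequencies drawn from the spectral density of the squared
    # exponential kernel (a Gaussian), scaled by the length-scale
    rng = np.random.RandomState(seed)
    W = rng.randn(X.shape[1], nbases) / lenscale
    XW = X.dot(W)
    # cos/sin pairs give 2 * nbases features whose inner products
    # approximate exp(-||x - x'||^2 / (2 * lenscale^2)) in expectation
    return np.hstack((np.cos(XW), np.sin(XW))) / np.sqrt(nbases)

X = np.random.randn(50, 3)
Phi = random_fourier_features(X, nbases=500)
K_approx = Phi.dot(Phi.T)  # approximates the RBF Gram matrix of X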

Quickstart

To install, simply run setup.py:

$ python setup.py install

or install with pip:

$ pip install git+https://github.com/nicta/revrand.git

Refer to docs/installation.rst for advanced installation instructions.

Have a look at some of the demos, e.g.:

$ python demos/demo_regression.py

Or,

$ python demos/demo_glm.py

Bayesian Linear Regression Example

Here is a very quick example of how to use Bayesian linear regression, with optimisation of the likelihood noise, the regulariser and the basis function parameters. Assume we already have noisy training targets y, inputs X and some query inputs Xs (as well as the true noiseless function f):

import matplotlib.pyplot as pl
import numpy as np
from revrand.basis_functions import LinearBasis, RandomRBF
from revrand.regression import learn, predict
from revrand.btypes import Parameter, Positive

...

# Concatenate a linear basis and a random radial basis (GP approx)
init_lenscale = Parameter(1.0, Positive())  # initial value and bounds
basis = LinearBasis(onescol=True) \
    + RandomRBF(nbases=300, Xdim=X.shape[1], lenscale_init=init_lenscale)

# Learn regression parameters and predict
params = learn(X, y, basis)
Eys, Vfs, Vys = predict(Xs, basis, *params)

# Training/Truth
pl.plot(X, y, 'k.', label='Training')
pl.plot(Xs, f, 'k-', label='Truth')

# Plot Regressor
Sys = np.sqrt(Vys)
pl.plot(Xs, Eys, 'g-', label='Bayesian linear regression')
pl.fill_between(Xs, Eys - 2 * Sys, Eys + 2 * Sys, facecolor='none',
                edgecolor='g', linestyle='--', label=None)

pl.legend()

pl.grid(True)
pl.title('Regression demo')
pl.ylabel('y')
pl.xlabel('x')
pl.show()

This script will output something like the following:

blr_demo.png
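
The data setup above is elided (...). Purely for illustration, toy data matching the variable names in the snippet could be generated as follows; the function and constants here are hypothetical and not taken from the demo scripts:

import numpy as np

np.random.seed(42)

def func(x):                                      # hypothetical noiseless function
    return np.sin(2 * x)

X = np.random.uniform(-3, 3, (100, 1))            # training inputs
y = func(X).ravel() + 0.3 * np.random.randn(100)  # noisy training targets
Xs = np.linspace(-3, 3, 200)[:, np.newaxis]       # query inputs
f = func(Xs).ravel()                              # true function at the query points

(The plotting calls treat Xs as one-dimensional, so pass Xs.ravel() there if needed.)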

Bayesian Generalised Linear Model Example

This example is very similar to the one above, but now let's assume our targets y are drawn from a Poisson likelihood (observation distribution) that is a function of the inputs, X. The task here is to predict the mean of the Poisson distribution for the query inputs Xs, as well as the uncertainty associated with the predictions.

import matplotlib.pyplot as pl
import numpy as np
from revrand import likelihoods
from revrand.basis_functions import RandomRBF
from revrand.btypes import Parameter, Positive
from revrand.glm import learn, predict_moments, predict_interval

...

# Random radial basis (GP approx)
init_lenscale = Parameter(1.0, Positive())  # initial value and bounds
basis = RandomRBF(nbases=100, Xdim=X.shape[1], lenscale_init=init_lenscale)

# Set up the likelihood of the GLM
llhood = likelihoods.Poisson(tranfcn='exp')  # log link

# Learn the GLM parameters and predict
params = learn(X, y, llhood, basis)
Eys, _, _, _ = predict_moments(Xs, llhood, basis, *params)
y95n, y95x = predict_interval(0.95, Xs, llhood, basis, *params)

# Training/Truth
pl.plot(X, y, 'k.', label='Training')
pl.plot(Xs, f, 'k-', label='Truth')

# Plot GLM SGD Regressor
pl.plot(Xs, Eys, 'b-', label='GLM mean.')
pl.fill_between(Xs, y95n, y95x, facecolor='none',
                edgecolor='b', linestyle='--', label=None)

pl.legend()

pl.grid(True)
pl.title('Regression demo')
pl.ylabel('y')
pl.xlabel('x')
pl.show()

This script will output something like the following:

glm_demo.png
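
The data setup is again elided. A hypothetical generator consistent with a Poisson observation model and a log link (matching tranfcn='exp' in the snippet) might be:

import numpy as np

np.random.seed(42)

def rate(x):                                   # hypothetical latent rate function
    return np.exp(np.sin(x) + 1.)

X = np.random.uniform(0, 10, (200, 1))         # training inputs
y = np.random.poisson(rate(X)).ravel()         # Poisson count targets
Xs = np.linspace(0, 10, 400)[:, np.newaxis]    # query inputs
f = rate(Xs).ravel()                           # true Poisson mean at the query points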

Large-scale Learning with Stochastic Gradients

By default, the GLM learns all of its parameters and hyperparameters using stochastic gradients and does not require any matrix inversion, so it can be used to learn from large datasets with many features (by contrast, regression.learn uses L-BFGS and requires a matrix inversion). We can also use the GLM to approximate and scale up regular Bayesian linear regression. For instance, we can modify the Bayesian linear regression example from before:

...

from revrand import glm, likelihoods

...

# Set up the likelihood of the GLM
llhood = likelihoods.Gaussian(var_init=Parameter(1., Positive()))

# Learn regression parameters and predict
params = glm.learn(X, y, llhood, basis)
Ey_g, Vy_g, Eyn, Eyx = glm.predict_moments(Xs, llhood, basis, *params)

...

# Plot GLM SGD Regressor
Sy_g = np.sqrt(Vy_g)
pl.plot(Xs, Ey_g, 'm-', label='GLM')
pl.fill_between(Xs, Ey_g - 2 * Sy_g, Ey_g + 2 * Sy_g, facecolor='none',
                edgecolor='m', linestyle='--', label=None)

...

This script will output something like the following:

glm_sgd_demo.png

We can see the approximation from the GLM is pretty good. This is because it uses a mixture of diagonal Gaussians as its posterior (thereby avoiding a full matrix inversion) to approximate the full Gaussian posterior covariance over the weights. This also has the advantage of allowing the model to learn multi-modal posterior distributions when non-Gaussian likelihoods are used.
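
Concretely, following [4] (with revrand's diagonal modification), the approximate posterior over the weights w is an equal-weight mixture of K diagonal Gaussians,

    q(w) = (1 / K) * sum_{k=1..K} N(w; m_k, diag(s_k^2)),

so only K mean vectors m_k and K variance vectors s_k^2 need to be stored and optimised, never a full covariance matrix.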

Feature Composition Framework

We have implemented an easy-to-use and extensible feature-building framework within revrand. You have already seen the basics demonstrated in the examples above, i.e. concatenation of basis functions:

>>> X = np.random.randn(100, 5)
>>> N, d = X.shape
>>> base = LinearBasis(onescol=True) + RandomRBF(Xdim=d, nbases=100)
>>> lenscale = 1.
>>> Phi = base(X, lenscale)
>>> Phi.shape
(100, 206)

There are a few things at work in this example:

  • Both LinearBasis and RandomRBF are applied to all of X, and the result is concatenated.

  • LinearBasis has prepended a column of ones onto X so a subsequent algorithm can learn a “bias” term.

  • RandomRBF is actually approximating a radial basis kernel function [3], so we can approximate how a kernel machine functions with a basis function! It also outputs 2 * nbases basis functions.

  • Hence the resulting feature matrix, Phi, has shape (N, d + 1 + 2 * nbases).

We can also use partial application of basis functions, e.g.

>>> base = LinearBasis(onescol=True, apply_ind=slice(0, 2)) \
    + RandomRBF(Xdim=d, nbases=100, apply_ind=slice(2, 5))
>>> Phi = base(X, lenscale)
>>> Phi.shape
(100, 203)

Now the basis functions are applied to separate dimensions of the input, X. That is, LinearBasis takes dimensions 0 and 1, and RandomRBF takes the rest, and again the results are concatenated.

Finally, if we use these basis functions with any of the algorithms in revrand, the parameters of the basis functions are learned as well! So really, in the above example lenscale = 1. is just an initial value for the kernel function length-scale!
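
Composition is also not limited to pairs, since + returns another basis object. A doctest-style sketch (assuming, as the examples above suggest, that a composed basis consumes one length-scale per RandomRBF in composition order):

>>> base = LinearBasis(onescol=True) + RandomRBF(Xdim=d, nbases=50) \
    + RandomRBF(Xdim=d, nbases=100)
>>> Phi = base(X, 1., 2.)  # one length-scale per RandomRBF
>>> Phi.shape
(100, 306)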

Bugs & Feedback

For bugs, questions and discussions, please use GitHub Issues.

References

[1] Yang, Z., Smola, A. J., Song, L., & Wilson, A. G. "A la Carte – Learning Fast Kernels". Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, 2015.

[2] Le, Q., Sarlos, T., & Smola, A. "Fastfood – Approximating Kernel Expansions in Loglinear Time". Proceedings of the International Conference on Machine Learning, 2013.

[3] Rahimi, A., & Recht, B. "Random Features for Large-Scale Kernel Machines". Advances in Neural Information Processing Systems, 2007.

[4] Gershman, S., Hoffman, M., & Blei, D. "Nonparametric Variational Inference". Proceedings of the International Conference on Machine Learning, 2012.
