A library of scalable Bayesian generalised linear models with fancy features

Project Description

This library implements various Bayesian linear models (Bayesian linear regression) and generalised linear models. A few features of this library are:

  • A composable basis function/feature framework for combining basis functions such as radial basis, sigmoidal and polynomial bases (see the sketch after this list).
  • Basis functions that can be used to approximate Gaussian processes with shift-invariant covariance functions (e.g. squared exponential) when used with linear models [1], [2], [3].
  • Non-Gaussian likelihoods for Bayesian generalised linear models, using a modified version of the nonparametric variational inference algorithm presented in [4].
  • Large-scale learning using stochastic gradient descent (ADADELTA).
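
For instance, basis objects compose with the + operator. Here is a minimal sketch of such a composition; the PolynomialBasis name and its order argument are assumptions based on the feature list above, so check revrand.basis_functions for the exact API:

from revrand.basis_functions import LinearBasis, PolynomialBasis, RandomRBF

# Concatenate a ones-column/linear basis, polynomial features and random
# RBF features into a single composite basis (hypothetical composition).
basis = LinearBasis(onescol=True) \
    + PolynomialBasis(order=2) \
    + RandomRBF(nbases=100, Xdim=2)

The composite can then be passed to the learning functions shown below, just like a single basis object.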

Quickstart

To install, simply run setup.py:

$ python setup.py install

or install with pip:

$ pip install git+https://github.com/nicta/revrand.git@release

Refer to docs/installation.rst for advanced installation instructions.

Have a look at some of the demos, e.g.:

$ python demos/demo_regression.py

or:

$ python demos/demo_glm.py

Bayesian Linear Regression Example

Here is a very quick example of how to use Bayesian linear regression with SGD optimisation of the likelihood noise, regulariser and basis function parameters. Assuming we already have training noisy targets y, inputs X, and some query inputs Xs (as well as the true noiseless function f):

import matplotlib.pyplot as pl
import numpy as np
from revrand.basis_functions import LinearBasis, RandomRBF
from revrand.regression import learn_sgd, predict

...
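
# Hypothetical toy data for illustration only (an assumption; the bundled
# demo scripts construct their own dataset): noisy observations of a
# smooth 1D function.
N, Ns = 150, 400
X = np.sort(10 * np.random.rand(N, 1), axis=0)        # training inputs
Xs = np.linspace(0., 10., Ns)[:, np.newaxis]          # query inputs
f = np.sin(Xs.flatten())                              # true noiseless function
y = np.sin(X.flatten()) + 0.2 * np.random.randn(N)    # noisy targets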

# Concatenate a linear basis and a Random radial basis (GP approx)
basis = LinearBasis(onescol=True) + RandomRBF(nbases=300, Xdim=X.shape[1])
init_lenscale = 1.0

# Learn the regression parameters and predict. Eys is the predictive mean,
# Vfs the latent function variance and Vys the (noisy) predictive variance.
params = learn_sgd(X, y, basis, [init_lenscale])
Eys, Vfs, Vys = predict(Xs, basis, *params)

# Training/Truth
pl.plot(X, y, 'k.', label='Training')
pl.plot(Xs, f, 'k-', label='Truth')

# SGD Regressor
Sys = np.sqrt(Vys)
pl.plot(Xs, Eys, 'r-', label='SGD Bayes linear reg.')
pl.fill_between(Xs.flatten(), Eys - 2 * Sys, Eys + 2 * Sys, facecolor='none',
                edgecolor='r', linestyle='--', label=None)  # ~95% band

pl.legend()

pl.grid(True)
pl.title('Regression demo')
pl.ylabel('y')
pl.xlabel('x')
pl.show()

This script will produce a plot of the training data, the true function, and the regressor's predictive mean with its uncertainty bands.

Bayesian Generalised Linear Model Example

This example is very similar to the one above, but now let's assume our targets y are drawn from a Poisson observation (likelihood) distribution whose rate is a function of the inputs, X. The task here is to predict the mean of the Poisson distribution for the query inputs Xs, as well as the uncertainty associated with the prediction.

import matplotlib.pyplot as pl
import numpy as np
from revrand import likelihoods
from revrand.basis_functions import RandomRBF
from revrand.glm import learn, predict_meanvar, predict_interval

...
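
# Hypothetical toy count data for illustration only (an assumption; the
# bundled demo constructs its own dataset): Poisson counts whose rate
# varies smoothly with the inputs.
N, Ns = 200, 400
X = np.sort(10 * np.random.rand(N, 1), axis=0)        # training inputs
Xs = np.linspace(0., 10., Ns)[:, np.newaxis]          # query inputs
f = np.exp(np.sin(Xs.flatten()))                      # true Poisson rate
y = np.random.poisson(np.exp(np.sin(X.flatten())))    # count targets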

# Random radial basis (GP approx)
basis = RandomRBF(nbases=100, Xdim=X.shape[1])
init_lenscale = 1.0

# Set up the likelihood of the GLM
llhood = likelihoods.Poisson(tranfcn='exp')  # log link

# Learn the GLM parameters and predict. Eys is the predictive mean; y95n
# and y95x are the lower and upper bounds of the 95% predictive interval.
params = learn(X, y, llhood, [], basis, [init_lenscale])
Eys, _, _, _ = predict_meanvar(Xs, llhood, basis, *params)
y95n, y95x = predict_interval(0.95, Xs, llhood, basis, *params)

# Training/Truth
pl.plot(X, y, 'k.', label='Training')
pl.plot(Xs, f, 'k-', label='Truth')

# GLM SGD Regressor
pl.plot(Xs, Eys, 'b-', label='GLM mean.')
pl.fill_between(Xs.flatten(), y95n, y95x, facecolor='none',
                edgecolor='b', linestyle='--', label=None)  # 95% interval

pl.legend()

pl.grid(True)
pl.title('GLM demo')
pl.ylabel('y')
pl.xlabel('x')
pl.show()

This script will produce a plot of the training counts, the true rate function, and the GLM's predictive mean with its 95% predictive interval.

Bugs & Feedback

For bugs, questions and discussions, please use GitHub Issues.

References

[1] Yang, Z., Smola, A. J., Song, L., & Wilson, A. G. "A la Carte – Learning Fast Kernels." Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, pp. 1098-1106, 2015.
[2] Le, Q., Sarlos, T., & Smola, A. "Fastfood – Approximating kernel expansions in loglinear time." Proceedings of the International Conference on Machine Learning, 2013.
[3] Rahimi, A., & Recht, B. "Random features for large-scale kernel machines." Advances in Neural Information Processing Systems, 2007.
[4] Gershman, S., Hoffman, M., & Blei, D. "Nonparametric variational inference." arXiv preprint arXiv:1206.4665, 2012.