# Hamiltonian sampling and analysis of sampled distributions

## Project description

Package samppy implements Hamiltonian Markov-chain sampling and some additional analysis methods for multivariate probability distributions.

The probability distribution is represented only by an array of independent and identically distributed (i.i.d.) samples drawn from the distribution, with uniform probability mass assigned to all samples.

The package includes three modules.

### Module `hamiltonian_sampler`

This module implements Hamiltonian Markov-chain sampling.

A `HamiltonianSampler` instance can generate random samples of a multivariate probability distribution, defined only by an un-normalized **log-likelihood** function and the **gradient** of that function.

The generated batch of sample vectors is stored in a 2-D NumPy array, either as **rows** or as **columns**, as controlled by the module-global variable `VECTOR_AXIS`.

Class `HamiltonianSampler` defines a standard isotropic sampler. Class `HamiltonianBoundedSampler` is a subclass that additionally allows one- or two-sided interval bounds on all vector elements.
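The core idea can be sketched as follows. This is a minimal, self-contained illustration of Hamiltonian sampling (leapfrog integration of the dynamics, followed by a Metropolis accept/reject step), not the package's actual implementation; the function name and parameters below are hypothetical.

```python
import numpy as np

def hmc_sample(log_pdf, grad_log_pdf, x0, n_samples=1000,
               epsilon=0.2, n_leapfrog=20, rng=None):
    """Draw samples by basic Hamiltonian Monte Carlo.

    log_pdf: un-normalized log-likelihood, vector -> scalar
    grad_log_pdf: its gradient, vector -> vector
    Samples are returned as ROWS of a 2-D array.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    for i in range(n_samples):
        p = rng.standard_normal(x.size)      # fresh isotropic momentum
        x_new, p_new = x.copy(), p.copy()
        # leapfrog integration: half-step p, alternating full steps, half-step p
        p_new += 0.5 * epsilon * grad_log_pdf(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += epsilon * p_new
            p_new += epsilon * grad_log_pdf(x_new)
        x_new += epsilon * p_new
        p_new += 0.5 * epsilon * grad_log_pdf(x_new)
        # Metropolis accept/reject on the change in total energy
        dH = (log_pdf(x_new) - 0.5 * p_new @ p_new
              - log_pdf(x) + 0.5 * p @ p)
        if np.log(rng.uniform()) < dH:
            x = x_new
        samples[i] = x
    return samples

# standard 2-D Gaussian: log p(x) = -0.5 * x.x (up to an additive constant)
xs = hmc_sample(lambda x: -0.5 * x @ x, lambda x: -x,
                x0=np.zeros(2), n_samples=2000, rng=1)
```

Because the log-likelihood only needs to be known up to an additive constant, the normalization of the target density is never required.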

### Module `credibility`

This module includes functions to estimate jointly credible differences and/or correlations between pairs of elements in a random vector with a multivariate probability distribution, represented only by samples.
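The idea can be illustrated with a simplified, *marginal* version of the computation: for each ordered pair of vector elements, estimate the probability of a difference directly as a fraction of samples. The package's procedure for *jointly* credible differences is more elaborate; the function below is a hypothetical sketch, not the module's API.

```python
import numpy as np

def pairwise_cred_diff(x, p_lim=0.95):
    """Return triples (i, j, p) for samples x with shape (n_samples, n_dim),
    where p = estimated probability that element i > element j, and p >= p_lim.
    """
    n, d = x.shape
    result = []
    for i in range(d):
        for j in range(d):
            if i != j:
                # fraction of samples in which element i exceeds element j
                p = np.mean(x[:, i] > x[:, j])
                if p >= p_lim:
                    result.append((i, j, p))
    return result

rng = np.random.default_rng(0)
# three elements; the third is shifted well above the other two
x = rng.standard_normal((1000, 3)) + np.array([0.0, 0.1, 3.0])
cred = pairwise_cred_diff(x)   # element 2 is credibly larger than 0 and 1
```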

### Module `sample_entropy`

This module includes a function to estimate the differential entropy of a multivariate probability distribution, represented only by samples. The entropy is estimated by the Kozachenko-Leonenko nearest-neighbor approximation (Singh and Poczos, 2016).
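As an illustration of the estimator (not the module's API; the function below is a hypothetical sketch), a basic k=1 Kozachenko-Leonenko estimate can be computed from nearest-neighbor distances found with a SciPy k-d tree:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x):
    """Kozachenko-Leonenko k=1 nearest-neighbor estimate of differential
    entropy (in nats) from i.i.d. samples x with shape (n_samples, dim)."""
    n, d = x.shape
    tree = cKDTree(x)
    # query k=2 because each point's nearest hit is itself (distance 0)
    eps, _ = tree.query(x, k=2)
    eps = eps[:, 1]                       # distance to true nearest neighbor
    # log volume of the d-dimensional unit ball
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1)
    return digamma(n) - digamma(1) + log_vd + d * np.mean(np.log(eps))

rng = np.random.default_rng(0)
x = rng.standard_normal((4000, 2))
h = kl_entropy(x)   # true value for 2-D N(0, I): log(2*pi*e) ~ 2.84 nats
```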

## Usage

This package was developed mainly for use in another project, PairedCompCalc. It is distributed separately because it may be useful for other purposes.

The Hamiltonian sampler code was inspired by the LAHMC project by Jascha Sohl-Dickstein. The present implementation includes some safety features to facilitate its use. The present sampler does not include sampling within a general subspace manifold.

## Requirements

The package requires Python 3.6 with NumPy and SciPy installed. It has been tested with NumPy 1.13 and 1.15, and SciPy 1.0.0 and 1.1.0.

## References

R M Neal (2011): MCMC using Hamiltonian dynamics. Ch. 5 in Brooks et al. (eds) Handbook of Markov Chain Monte Carlo. Chapman and Hall / CRC Press.

A Leijon, G E Henter, and M Dahlquist (2016): Bayesian analysis of phoneme confusion matrices. IEEE Transactions on Audio, Speech, and Language Processing 24(3):469–482. (Describes an application of the credible-difference calculation.)

F Perez-Cruz (2008): Estimation of Information Theoretic Measures for Continuous Random Variables. Advances in Neural Information Processing Systems 21 (NIPS 2008).

S Singh and B Poczos (2016): Analysis of k-nearest neighbor distances with application to entropy estimation. arXiv:1603.08578 [math.ST].
