Hamiltonian sampling and analysis of sampled distributions
Package samppy implements Hamiltonian Markov-chain sampling and some additional analysis methods for multivariate probability distributions.
The probability distribution is represented only by an array of independent and identically distributed (i.i.d.) samples drawn from the distribution.
The package includes three modules:
Module hamiltonian_sampler implements Hamiltonian Markov-chain sampling. A HamiltonianSampler instance can generate random samples from a multivariate probability distribution defined only by a non-normalized log-likelihood function and the gradient of that function.
The generated batch of sample vectors is stored in a 2-D NumPy array, either as rows or as columns.
Class HamiltonianSampler defines a standard isotropic sampler. Class HamiltonianBoundedSampler is a subclass that also allows one- or two-sided interval limits for all vector elements.
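For illustration, the core algorithm (leapfrog integration of Hamiltonian dynamics followed by a Metropolis accept/reject step, as described by Neal, 2011) can be sketched as follows. This is not the samppy API; the function and argument names are invented for this example, and the package's sampler classes wrap this basic scheme with additional safety features:

    import numpy as np

    def hmc_sample(log_pdf, grad_log_pdf, x0, n_samples,
                   epsilon=0.1, n_leapfrog=10, rng=None):
        """Toy HMC sampler; NOT the samppy API, only the basic algorithm.

        log_pdf: non-normalized log-density, (d,) array -> float.
        grad_log_pdf: gradient of log_pdf, (d,) array -> (d,) array.
        x0: (d,) initial state.
        Returns an (n_samples, d) array with one sample vector per row.
        """
        rng = np.random.default_rng() if rng is None else rng
        x = np.array(x0, dtype=float)
        samples = np.empty((n_samples, x.size))
        for n in range(n_samples):
            p = rng.standard_normal(x.size)  # fresh isotropic momentum
            x_new, p_new = x.copy(), p.copy()
            # leapfrog integration of the Hamiltonian dynamics
            p_new += 0.5 * epsilon * grad_log_pdf(x_new)
            for step in range(n_leapfrog):
                x_new += epsilon * p_new
                if step < n_leapfrog - 1:
                    p_new += epsilon * grad_log_pdf(x_new)
            p_new += 0.5 * epsilon * grad_log_pdf(x_new)
            # Metropolis accept/reject on the change in total energy
            log_alpha = (log_pdf(x_new) - 0.5 * p_new @ p_new
                         - log_pdf(x) + 0.5 * p @ p)
            if np.log(rng.uniform()) < log_alpha:
                x = x_new
            samples[n] = x
        return samples

    # Example: sample a 2-D standard Gaussian.
    s = hmc_sample(lambda x: -0.5 * x @ x, lambda x: -x,
                   x0=np.zeros(2), n_samples=1000)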
Module credibility includes functions to estimate jointly credible differences and/or correlations between pairs of elements of a random vector, given only samples from its multivariate probability distribution.
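The underlying idea can be sketched as follows: candidate pairwise differences are added one at a time, most credible first, as long as the joint probability that all selected differences hold remains above the desired credibility level (cf. Leijon et al., 2016). The function below is an invented illustration, not the module's actual interface:

    import numpy as np
    from itertools import permutations

    def jointly_credible_diff(x, p_lim=0.9):
        """Greedy sketch of jointly credible pairwise differences.

        x: (n_samples, d) array of i.i.d. samples of a random vector.
        Returns (pairs, p_joint): a list of index pairs (i, j) such that
        the event {x[i] > x[j] for ALL selected pairs} has estimated
        probability p_joint >= p_lim.
        """
        n, d = x.shape
        # all ordered pairs, most credible marginal difference first
        pairs = sorted(permutations(range(d), 2),
                       key=lambda ij: np.mean(x[:, ij[0]] > x[:, ij[1]]),
                       reverse=True)
        selected = []
        joint = np.ones(n, dtype=bool)  # samples where all selected diffs hold
        for (i, j) in pairs:
            trial = joint & (x[:, i] > x[:, j])
            if np.mean(trial) >= p_lim:
                selected.append((i, j))
                joint = trial
        return selected, float(np.mean(joint))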
Module sample_entropy includes a function to estimate the differential entropy of a multivariate probability distribution, represented only by samples. The entropy is estimated by the Kozachenko-Leonenko nearest-neighbor approximation (Singh and Poczos, 2016).
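For illustration, the Kozachenko-Leonenko estimator itself can be written in a few lines with SciPy's k-d tree. The function below is an invented sketch of the technique, not the module's actual interface:

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def entropy_knn(x, k=1):
        """Kozachenko-Leonenko differential entropy estimate, in nats.

        x: (n_samples, n_dim) array of i.i.d. samples.
        k: number of nearest neighbors used by the estimator.
        """
        n, d = x.shape
        # distance from each sample to its k-th nearest neighbor;
        # k + 1 because the query returns the point itself at distance 0
        eps = cKDTree(x).query(x, k=k + 1)[0][:, -1]
        # log volume of the d-dimensional unit ball
        log_v_d = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.)
        return digamma(n) - digamma(k) + log_v_d + d * np.mean(np.log(eps))

    # Sanity check against the exact Gaussian entropy 0.5 * d * log(2 pi e):
    rng = np.random.default_rng(1)
    x = rng.standard_normal((10_000, 3))
    print(entropy_knn(x, k=3), 1.5 * np.log(2 * np.pi * np.e))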
This package was developed mainly for use by another project. It is distributed separately because it may be useful for other purposes.
The Hamiltonian sampler code was inspired by the LAHMC project by Jascha Sohl-Dickstein. The present implementation includes some safety features to facilitate its use, but it does not include sampling within a general subspace manifold.
The package requires Python >= 3.6, with NumPy and SciPy installed. It has been tested with NumPy v. 1.17 and SciPy v. 1.5.4.
References

R. M. Neal (2011): MCMC using Hamiltonian dynamics. Ch. 5 in S. Brooks et al. (eds), Handbook of Markov Chain Monte Carlo. Chapman and Hall / CRC Press.
A. Leijon, G. E. Henter, and M. Dahlquist (2016): Bayesian analysis of phoneme confusion matrices. IEEE/ACM Transactions on Audio, Speech, and Language Processing 24(3):469–482. (Describes an application of the credible-difference calculation.)
F. Pérez-Cruz (2008): Estimation of Information Theoretic Measures for Continuous Random Variables. Advances in Neural Information Processing Systems 21 (NIPS 2008).
S. Singh and B. Póczos (2016): Analysis of k-nearest neighbor distances with application to entropy estimation. arXiv:1603.08578 [math.ST].