
Hamiltonian sampling and analysis of sampled distributions

Project description

Package samppy implements Hamiltonian Markov-chain sampling and some additional analysis methods for multivariate probability distributions.

The probability distribution is represented only by an array of independent and identically distributed (i.i.d.) samples drawn from the distribution.

The package includes three modules:

  • Module hamiltonian_sampler implements Hamiltonian Markov-chain sampling. A HamiltonianSampler instance can generate random samples of a multivariate probability distribution defined only by a non-normalized LOG-LIKELIHOOD function and the GRADIENT of that function.

    The generated batch of sample vectors is stored in a 2D numpy array, either as rows or as columns.

    Class HamiltonianSampler defines a standard isotropic sampler. Class HamiltonianBoundedSampler is a subclass that additionally allows one- or two-sided interval bounds for all vector elements.

  • Module credibility includes functions to estimate jointly credible differences and/or correlations between pairs of elements in a random vector with a multivariate probability distribution, represented only by samples.

  • Module sample_entropy includes a function to estimate the differential entropy of a multivariate probability distribution, represented only by samples. The entropy is estimated by the Kozachenko-Leonenko nearest-neighbor approximation (Singh and Poczos, 2016).
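As an illustration of the general technique behind the first module (not the samppy API), one Hamiltonian Monte Carlo step can be sketched in plain NumPy: leapfrog integration of fictitious dynamics, followed by a Metropolis accept/reject test. The function name hmc_sample and all of its parameters are invented here.

```python
import numpy as np

def hmc_sample(log_pdf, grad, x0, n_samples, step=0.1, n_leapfrog=20, rng=None):
    """Minimal Hamiltonian MC: leapfrog integration plus a Metropolis test."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    for i in range(n_samples):
        p = rng.standard_normal(x.size)            # fresh random momentum
        x_new, p_new = x.copy(), p.copy()
        p_new += 0.5 * step * grad(x_new)          # initial momentum half-step
        for _ in range(n_leapfrog - 1):
            x_new += step * p_new                  # full position step
            p_new += step * grad(x_new)            # full momentum step
        x_new += step * p_new
        p_new += 0.5 * step * grad(x_new)          # final momentum half-step
        # Accept or reject on the change in total energy H = -log_pdf + kinetic:
        log_accept = (log_pdf(x_new) - 0.5 * p_new @ p_new
                      - log_pdf(x) + 0.5 * p @ p)
        if np.log(rng.uniform()) < log_accept:
            x = x_new
        samples[i] = x
    return samples

# Example: sample a standard 2-D Gaussian, given only its un-normalized
# log-density and the gradient of that log-density.
s = hmc_sample(lambda x: -0.5 * x @ x, lambda x: -x,
               np.zeros(2), 2000, rng=np.random.default_rng(0))
```

With step * n_leapfrog chosen so that one trajectory covers a typical length scale of the distribution, successive sample vectors become nearly independent, which is what makes the i.i.d. representation above workable.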

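A credibility analysis of this kind starts from exceedance probabilities between vector elements, which are easy to estimate directly from the samples. The sketch below shows only this simple marginal pairwise computation, not the joint credibility calculation of Leijon et al. (2016) that the credibility module implements; the function name pairwise_exceedance is invented here.

```python
import numpy as np

def pairwise_exceedance(samples):
    """Estimate P(x_i > x_j) for all element pairs from i.i.d. sample vectors.

    samples: (n_samples, dim) array with one sample vector per row.
    Returns a (dim, dim) array whose (i, j) entry is the fraction of
    samples in which element i exceeds element j.
    """
    diff = samples[:, :, None] - samples[:, None, :]   # shape (n, dim, dim)
    return (diff > 0).mean(axis=0)

rng = np.random.default_rng(0)
# Two correlated elements with means 0 and 1:
x = rng.multivariate_normal([0.0, 1.0], [[1.0, 0.5], [0.5, 1.0]], size=5000)
p = pairwise_exceedance(x)
# p[1, 0] estimates P(x_1 > x_0).  The difference x_1 - x_0 has mean 1 and
# standard deviation 1, so the true value is Phi(1), about 0.84.
```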

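The Kozachenko-Leonenko idea behind sample_entropy can be sketched with a brute-force nearest-neighbor estimator (here with k = 1). This is a simplified illustration, not samppy's implementation; the function name kl_entropy is invented here, and scipy.special.digamma is avoided by using a large-n approximation.

```python
import math
import numpy as np

def kl_entropy(samples):
    """Kozachenko-Leonenko differential-entropy estimate (k = 1 neighbor).

    H_hat = psi(n) - psi(1) + log(c_d) + (d / n) * sum_i log(eps_i),
    where eps_i is the distance from sample i to its nearest neighbor
    and c_d is the volume of the d-dimensional unit ball.
    """
    n, d = samples.shape
    # Brute-force pairwise distances; fine for a few thousand samples.
    dist = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)                 # exclude self-distances
    eps = dist.min(axis=1)                         # nearest-neighbor distances
    c_d = math.pi ** (d / 2) / math.gamma(d / 2 + 1)   # unit-ball volume
    psi_n = math.log(n) - 1.0 / (2 * n)            # digamma(n), large-n approx
    psi_1 = -0.5772156649015329                    # digamma(1) = -Euler gamma
    return psi_n - psi_1 + math.log(c_d) + d * np.mean(np.log(eps))

rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 2))
# True differential entropy of a standard 2-D Gaussian: log(2*pi*e), about 2.84.
h = kl_entropy(x)
```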
This package was developed mainly for use by another project. It is distributed separately because it may be useful for other purposes.

The Hamiltonian sampler code was inspired by the LAHMC project by Jascha Sohl-Dickstein. The present implementation includes some safety features to facilitate its use, but it does not include sampling within a general subspace manifold.


The package requires Python 3.6 with NumPy and SciPy installed. It needs some features of NumPy v1.17 and has been tested with SciPy 1.13.


References

R. M. Neal (2011): MCMC using Hamiltonian dynamics. Ch. 5 in Brooks et al. (eds), Handbook of Markov Chain Monte Carlo. Chapman and Hall / CRC Press.

A. Leijon, G. E. Henter, and M. Dahlquist (2016): Bayesian analysis of phoneme confusion matrices. IEEE Transactions on Audio, Speech, and Language Processing 24(3):469–482. (Describes an application of the credible-difference calculation.)

F. Perez-Cruz (2008): Estimation of information theoretic measures for continuous random variables. Advances in Neural Information Processing Systems 21 (NIPS 2008).

S. Singh and B. Poczos (2016): Analysis of k-nearest neighbor distances with application to entropy estimation. arXiv:1603.08578 [math.ST].


Download files


Files for samppy, version 1.1.0:

  • samppy-1.1.0-py3-none-any.whl (18.5 kB): wheel, Python 3
  • samppy-1.1.0.tar.gz (15.6 kB): source distribution
