
ASCL record: ascl:1706.005 · Available on PyPI · License: LGPL-3.0

Logarithmantic Monte Carlo (LMC)

Python code for Markov Chain Monte Carlo

Logarithmancy (n): divination by means of algorithms

What is this?

LMC (not to be confused with the Large Magellanic Cloud) is a bundle of Python code for performing Markov Chain Monte Carlo. It implements a few different multidimensional proposal strategies and (optionally parallel) adaptation methods. There are similar packages out there, notably pymc; LMC exists because I found the alternatives too inflexible for the work I was doing at the time. On the off chance that someone else is in the same boat, here it is.

The samplers currently included are Metropolis, slice, and the affine-invariant sampler popularized by emcee (Goodman & Weare 2010).
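As a point of reference, the simplest of these, random-walk Metropolis, can be sketched in a few lines. This is a generic illustration of the algorithm, not LMC's actual interface; the function and parameter names here are made up for the example.

```python
import numpy as np

def metropolis(log_post, x0, n_steps, step_size=0.5, rng=None):
    """Random-walk Metropolis: propose a Gaussian step, accept with
    probability min(1, exp(new_logp - old_logp))."""
    rng = rng or np.random.default_rng(42)
    x = np.asarray(x0, dtype=float)
    logp = log_post(x)
    chain = []
    for _ in range(n_steps):
        prop = x + step_size * rng.standard_normal(x.shape)
        logp_prop = log_post(prop)
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = prop, logp_prop
        chain.append(x.copy())
    return np.array(chain)

# Sample a standard 2D Gaussian target.
chain = metropolis(lambda x: -0.5 * np.sum(x**2), np.zeros(2), 5000)
```

The slice and affine-invariant samplers follow the same accept/advance loop but construct their proposals differently.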

An abridged description of the package (from the help function) is copied here:

The module should be very flexible, but is designed with these things foremost in mind:
 1. use with expensive likelihood calculations which probably have a host of hard-to-modify
    code associated with them.
 2. making it straightforward to break the parameter space into subspaces which can be sampled
    using different proposal methods and at different rates. For example, if changing some
    parameters requires very expensive calculations in the likelihood, the other, faster
    parameters can be sampled at a higher rate. Or, some parameters may lend themselves to
    Gibbs sampling, while others may not, and these can be block updated independently.
 3. keeping the overhead low to facilitate large numbers of parameters. Some of this has been
    lost in the port from C++, but, for example, the package provides automatic tuning of the
    proposal covariance for block updating without needing to store traces of the parameters in
    memory.
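Item (2) can be illustrated generically. The sketch below (again, not LMC's actual interface — the block structure and names are invented for the example) updates a "slow" block of parameters once per iteration and a "fast" block several times, each with its own Metropolis step.

```python
import numpy as np

def block_metropolis(log_post, x0, n_steps, blocks, rng=None):
    """Update each block of parameter indices with its own random-walk
    Metropolis step, 'rate' times per iteration (hypothetical sketch)."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    logp = log_post(x)
    for _ in range(n_steps):
        for idx, step, rate in blocks:
            for _ in range(rate):
                prop = x.copy()
                prop[idx] += step * rng.standard_normal(len(idx))
                logp_prop = log_post(prop)
                if np.log(rng.uniform()) < logp_prop - logp:
                    x, logp = prop, logp_prop
    return x

# One expensive parameter updated once per iteration; two cheap ones, five times.
blocks = [([0], 0.5, 1), ([1, 2], 0.5, 5)]
x = block_metropolis(lambda x: -0.5 * np.sum(x**2), np.ones(3), 1000, blocks)
```

In LMC the per-block proposal method and update rate are configurable rather than hard-coded as above.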

Real-valued parameters are usually assumed, but the framework can be used with other types of
parameters, with suitable overloading of classes.

A byproduct of item (1) is that the user is expected to handle all aspects of the calculation of
the posterior. The module doesn't implement assignment of canned, standard priors, or automatic
discovery of shortcuts like conjugate Gibbs sampling. The idea is that the user is in the best
position to know how the details of the likelihood and priors should be implemented.
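Concretely, this means the user supplies a single function returning the log-posterior (log-likelihood plus log-priors). The toy example below — a Gaussian likelihood with flat priors, entirely written by the user rather than provided by LMC — shows the kind of function involved.

```python
import numpy as np

data = np.array([1.2, 0.8, 1.1, 0.9, 1.0])  # toy observations

def log_posterior(theta):
    """Log-posterior for a Gaussian model with unknown mean mu and
    log-standard-deviation log_sigma."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    # Gaussian log-likelihood (dropping the constant term)
    loglike = -0.5 * np.sum(((data - mu) / sigma) ** 2) - len(data) * log_sigma
    logprior = 0.0  # flat (improper) priors on mu and log_sigma
    return loglike + logprior

print(log_posterior(np.array([1.0, 0.0])))  # ≈ -0.05
```

Any conjugacy or other shortcut in the model is likewise the user's to exploit, by writing a Gibbs-style update by hand.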

Communication between parallel chains can significantly speed up convergence. In parallel mode,
adaptive Updaters use information from all running chains to tune their proposals, rather than
only from their own chain. The Gelman-Rubin convergence criterion (ratio of inter- to intra-chain
variances) for each free parameter is also calculated. Parallelization is implemented in two ways;
see ?Updater for instructions on using each.
 1. Via MPI (using mpi4py). MPI adaptations are synchronous: when a chain reaches a communication
    point, it stops until all chains have caught up.
 2. Via the filesystem. When a chain adapts, it will write its covariance information to a file. It
    will then read in any information from other chains that is present in similar files, and
    incorporate it when tuning. This process is asynchronous; chains will not wait for one another;
    they will simply adapt using whatever information has been shared at the time.
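The Gelman-Rubin criterion mentioned above compares between-chain and within-chain variance. A minimal single-parameter version (a generic sketch of the standard formula, not LMC's internal code) looks like:

```python
import numpy as np

def gelman_rubin(chains):
    """R for one parameter from an (m chains, n samples) array: the square
    root of the ratio of the pooled variance estimate to the mean
    within-chain variance. Values near 1 indicate convergence."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    within = chains.var(axis=1, ddof=1).mean()     # W: mean within-chain variance
    between = n * chains.mean(axis=1).var(ddof=1)  # B: scaled variance of chain means
    var_hat = (n - 1) / n * within + between / n   # pooled variance estimate
    return np.sqrt(var_hat / within)

rng = np.random.default_rng(1)
well_mixed = rng.standard_normal((4, 2000))  # four chains sampling the same target
print(gelman_rubin(well_mixed))  # close to 1 for converged chains
```

Chains that have not mixed — for instance, chains stuck around different modes — give a ratio well above 1.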

Installation

Automatic

Install from PyPI by running pip install lmc.

Manual

Download lmc/lmc.py and put it somewhere on your PYTHONPATH. You will need to have the numpy package installed. The mpi4py package is optional, but highly recommended.

Usage and Help

Documentation can be found throughout lmc.py, mostly in the form of docstrings, so it’s also available through the Python interpreter. There’s also a help() function (near the top of the file, if you’re browsing) and an example() function (near the bottom).

The examples can also be browsed in the source repository.
