UltraNest

Fit and compare complex models reliably and rapidly with advanced sampling techniques.

Correctness. Speed. Ease of use. 🦔

About

When scientific models are compared to data, two tasks are important: 1) constraining the model parameters and 2) comparing the model to other models. Different techniques have been developed to explore model parameter spaces. This package implements a Monte Carlo technique called nested sampling.

Nested sampling allows Bayesian inference on arbitrary user-defined likelihoods. In particular, posterior probability distributions on model parameters are constructed, and the marginal likelihood (“evidence”) Z is computed. The former can be used to describe the parameter constraints given the data; the latter can be used for model comparison (via Bayes factors) as a measure of the prediction parsimony of a model.
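
A minimal sketch of what such an analysis typically looks like with this package (the straight-line model, data, priors, and parameter names below are illustrative, not taken from the documentation):

    import numpy as np
    from ultranest import ReactiveNestedSampler

    # synthetic data for an illustrative straight-line model
    x = np.linspace(0, 1, 50)
    y = 3.0 * x + 1.0 + np.random.normal(0, 0.1, size=x.size)

    param_names = ['slope', 'offset']

    def prior_transform(cube):
        # map the unit hypercube to the prior: uniform priors on [-10, 10] for both parameters
        return cube * 20 - 10

    def log_likelihood(params):
        # Gaussian likelihood with known noise sigma = 0.1
        slope, offset = params
        model = slope * x + offset
        return -0.5 * np.sum(((y - model) / 0.1)**2)

    sampler = ReactiveNestedSampler(param_names, log_likelihood, transform=prior_transform)
    result = sampler.run()
    sampler.print_results()

    print(result['logz'], result['logzerr'])   # marginal likelihood ln(Z) and its uncertainty
    posterior_samples = result['samples']      # equally weighted posterior samples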

In the last decade, multiple variants of nested sampling have been developed. These differ in how nested sampling finds better and better fits while respecting the priors (constrained likelihood prior sampling techniques), and whether it is allowed to go back to worse fits and explore the parameter space more.

This package develops novel, advanced techniques for both (see How it works). They are especially remarkable for being free of tuning parameters and theoretically justified. Beyond that, UltraNest supports big data sets and high-performance computing applications.

UltraNest is intended for fitting complex physical models with slow likelihood evaluations, with one to hundreds of parameters. It is intended to replace heuristic methods like multi-ellipsoid nested sampling and dynamic nested sampling with more rigorous methods. UltraNest also aims for feature parity with other packages (such as MultiNest).

You can help by testing UltraNest and reporting issues. Code contributions are welcome. See the Contributing page.

Features

  • Pythonic

    • pip installable

    • Easy to program for: Sanity checks with meaningful errors

    • Can control the run programmatically and check status

    • Reasonable defaults, but customizable

    • Thoroughly tested with many unit and integration tests

  • Robust exploration easily handles:

    • Degenerate parameter spaces such as bananas or tight correlations

    • Multiple modes/solutions in the parameter space

    • Robust, parameter-free MLFriends algorithm (metric learning RadFriends, Buchner+14,+19), with new improvements (region follows new live points, clustering improves metric iteratively).

    • High-dimensional problems with slice sampling (or ellipsoidal sampling, FlatNUTS, etc.) inside the region (see the sketch after this list).

    • Wrapped/circular parameters, derived parameters

    • Fast-slow parameters

  • Strategic nested sampling

    • can vary (increase) the number of live points (akin to dynamic nested sampling, but with different targets)

    • can sample clusters optimally (e.g., at least 50 points per cluster/mode/solution)

    • can target minimizing parameter estimation uncertainties

    • can target a desired evidence uncertainty threshold

    • can target a desired number of effective samples

    • or any combination of the above

    • Robust ln(Z) uncertainties by bootstrapping live points.

  • Lightweight and fast

    • some functions implemented in Cython

    • vectorized likelihood function calls

    • Use multiple cores, fully parallelizable from laptops to clusters

    • MPI support

  • Advanced visualisation and crash recovery:

    • Checkpointing and resuming, even with a different number of live points

    • Run-time visualisations and exploration information

    • Corner plots, run and parameter exploration diagnostic plots
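
Several of the options listed above correspond to constructor and run() arguments. The following sketch (not from the documentation; the parameter names, log directory, and run targets are illustrative, and the step-sampler class and option names vary between UltraNest versions) shows how wrapped parameters, checkpointing and resuming, slice sampling, and strategic run targets might be enabled:

    import ultranest
    import ultranest.stepsampler

    sampler = ultranest.ReactiveNestedSampler(
        ['period', 'amplitude'],          # illustrative parameter names
        log_likelihood,                   # assumed to be defined as in the example above
        transform=prior_transform,
        wrapped_params=[True, False],     # mark circular/wrapped parameters
        vectorized=False,                 # set True if log_likelihood accepts an (n, d) array of points
        log_dir='myrun',                  # write checkpoints here ...
        resume=True,                      # ... and resume an interrupted run from them
    )

    # For high-dimensional problems, switch to a slice step sampler inside the region
    # (class and option names differ between UltraNest versions):
    sampler.stepsampler = ultranest.stepsampler.SliceSampler(
        nsteps=50,
        generate_direction=ultranest.stepsampler.generate_mixture_random_direction,
    )

    # Strategic run targets: live points, evidence accuracy, effective sample size
    result = sampler.run(
        min_num_live_points=400,
        dlogz=0.5,      # target ln(Z) uncertainty
        min_ess=1000,   # target number of effective posterior samples
    )

    # Run-time diagnostics and corner plots
    sampler.plot_run()
    sampler.plot_trace()
    sampler.plot_corner()

For parallel runs, no code changes are needed: if mpi4py is installed, launching the same script under MPI (e.g. mpiexec -np 4 python fit_model.py, with a script name of your choosing) distributes the likelihood evaluations across processes.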

TODO

  • Documentation:

    • Example power law fit

    • Example spectral line fit, white and GP

    • Example low-d Bayesian GP emulator as pre-filter to model evaluation

    • Example verifying integration with VB+IS

Usage

Read the full documentation at:

https://johannesbuchner.github.io/UltraNest/

Licence

GPLv3 (see LICENCE file). If you require another license, please contact me.

Icon made by Freepik.

Release Notes

2.0.0 (2019-10-03)

  • First release.

1.0.0 (2014)

  • A simpler version referenced in Buchner et al. (2014), combining RadFriends with an optional Metropolis-Hastings proposal.
