maxentropy: Maximum entropy and minimum divergence models in Python
This package helps you to construct a probability distribution (Bayesian prior) from prior information that you encode as generalized moment constraints.
You can use it to either:

- find the flattest distribution that meets your constraints, using the maximum entropy principle (discrete distributions only); or
- find the "closest" model to a given prior model (in a KL divergence sense) that also satisfies your additional constraints.
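The first use case can be sketched from scratch with NumPy and SciPy. This is an illustrative solution of the classic "loaded die" problem (find the flattest distribution over the six faces whose mean is 4.5) via the dual of the maximum-entropy program; it is not this package's API, and the variable names and the target mean of 4.5 are assumptions for the example.

```python
import numpy as np
from scipy.optimize import minimize

xs = np.arange(1, 7)      # faces of the die
target_mean = 4.5         # moment constraint: E[X] = 4.5

def dual(lam):
    # Dual objective: log-partition function minus lambda * target.
    # Minimizing it over lambda yields the maximum-entropy parameters.
    return np.log(np.sum(np.exp(lam * xs))) - lam * target_mean

lam = minimize(dual, x0=0.0).x[0]

# The solution has exponential form: p_i proportional to exp(lam * x_i).
p = np.exp(lam * xs)
p /= p.sum()

print(p)        # face probabilities, tilted toward the high faces
print(p @ xs)   # approximately 4.5, as constrained
```

With the mean pinned above 3.5, the entropy-maximizing probabilities increase monotonically across the faces, which is exactly the exponential tilt of the uniform distribution.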
The maximum entropy principle has been shown [Cox 1982, Jaynes 2003] to be the unique consistent approach to constructing a discrete probability distribution from prior information that is available as "testable information".
If the constraints have the form of linear moment constraints, the principle gives rise to a unique probability distribution of exponential form. Many well-known probability distributions are special cases of maximum entropy distributions, including the uniform, geometric, exponential, Pareto, normal, von Mises, and Cauchy distributions.
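The "special case" claim can be checked numerically. The sketch below (again from scratch, not this package's API; the support cutoff of 200 and the target mean of 3 are assumptions for the example) fits the maximum-entropy distribution on the non-negative integers with a fixed mean and confirms that it comes out geometric, i.e. of exponential form p(x) ∝ exp(λx):

```python
import numpy as np
from scipy.optimize import minimize_scalar

xs = np.arange(201)   # {0, 1, ..., 200}, a truncation of the non-negative integers
target_mean = 3.0     # moment constraint: E[X] = 3

def dual(lam):
    # Same dual objective as before: log Z(lambda) - lambda * target.
    return np.log(np.sum(np.exp(lam * xs))) - lam * target_mean

# The optimal lambda is negative here (the mean is well below the midpoint),
# so we can safely bound the search.
lam = minimize_scalar(dual, bounds=(-2.0, 0.0), method="bounded").x

p = np.exp(lam * xs)
p /= p.sum()

# Successive ratios p(x+1)/p(x) are constant -- the geometric distribution.
ratios = p[1:6] / p[:5]
print(ratios)   # all approximately 0.75, since mean r/(1-r) = 3 gives r = 3/4
```

The constant ratio e^λ = 3/4 follows analytically from the geometric mean formula r/(1−r) = 3, so the numerical fit agrees with the closed-form special case.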
Examples: constructing a prior subject to known constraints
See the notebooks folder.
This is a good place to start: Loaded die example (scikit-learn estimator API)
This package previously lived in SciPy as scipy.maxentropy (versions 0.5 to 0.10). It was under-maintained and was removed in SciPy 0.11. It has since been resurrected and refactored to use the scikit-learn estimator interface.
(c) Ed Schofield, 2003-2019