
Bayesian predictive classification and structure learning in decomposable graphical models using particle Gibbs.



Bayesian inference in decomposable graphical models using sequential Monte Carlo methods

This library implements Bayesian inference for decomposable (triangulated) graphical models based on sequential Monte Carlo methods. Currently supported functionality includes:

  • Bayesian structure learning for discrete log-linear and Gaussian data.

  • Estimation of the number of decomposable graphs with a given number of nodes.

  • Predictive classification using Bayesian model averaging (BMA).

  • Random generation of junction trees (the Christmas tree algorithm).
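As a toy, package-agnostic illustration of the BMA idea listed above (the numbers are made up, and this does not use trilearn's API): each candidate model's predictive class probabilities are averaged, weighted by the model's posterior probability.

```python
# Bayesian model averaging: p(y | x, data) = sum_m p(y | x, m) * p(m | data)
model_post = [0.6, 0.3, 0.1]                        # posterior model probabilities (toy values)
class_pred = [[0.9, 0.1], [0.4, 0.6], [0.5, 0.5]]   # per-model p(y | x) for two classes

# Average the class probabilities across models, weighted by model posteriors
bma = [sum(w * p[c] for w, p in zip(model_post, class_pred)) for c in (0, 1)]
# bma is approximately [0.71, 0.29]
```

In trilearn, the model posterior over decomposable graphs is itself approximated by the particle Gibbs sampler; the toy vector above stands in for that output.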

Installation

This package currently requires Python 2.7. If Graphviz is not installed, you can install it with your system package manager (for example brew, aptitude, or pacman):

$ brew install graphviz

On Ubuntu you might need to run

sudo apt-get install python-dev graphviz libgraphviz-dev pkg-config

Then run

$ pip install trilearn

It is also possible to pull trilearn as a Docker image:

$ docker pull onceltuca/trilearn

Running the tests

$ make test

Usage

See the Jupyter notebooks for examples of usage.

Scripts

Continuous data

To approximate the posterior distribution over decomposable graphs underlying the dataset sample_data/data_ar1-5.csv, run

$ pgibbs_ggm_sample -N 50 -M 1000 -f sample_data/data_ar1-5.csv
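The file name suggests Gaussian samples with AR(1)-type dependence over 5 variables. Assuming the expected input format is one row per observation and one column per variable (an assumption; the format is not documented here), data of that flavor could be generated with NumPy. The file name my_ar1_data.csv is hypothetical:

```python
import numpy as np

# Sketch: samples from a 5-dimensional Gaussian with AR(1) correlation,
# i.e. Sigma[i, j] = rho ** |i - j| (band-structured dependence).
rng = np.random.default_rng(0)
rho, p, n = 0.5, 5, 100
idx = np.arange(p)
sigma = rho ** np.abs(idx[:, None] - idx[None, :])
data = rng.multivariate_normal(np.zeros(p), sigma, size=n)

# One row per observation, comma-separated (assumed input format)
np.savetxt("my_ar1_data.csv", data, delimiter=",")  # hypothetical file name
```

The true graph for such data is a path (a Markov chain over the 5 variables), which is what the sampler should concentrate on.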

This produces a file containing the Markov chain generated by the particle Gibbs algorithm. To analyze the chain, run

$ analyze_graph_tajectories

This produces a number of files in the current directory for further analysis.

Discrete data

The data set sample_data/czech_autoworkers.csv contains six binary variables. To generate a particle Gibbs trajectory of decomposable graphs, run

$ pgibbs_loglinear_sample -N 50 -M 300 -f sample_data/czech_autoworkers.csv

and

$ analyze_graph_tajectories

This produces a number of files in the current directory.
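For discrete data, the expected input would presumably be one row per observation with 0/1 values (an assumption; the format is not documented here). A sketch of generating a toy six-variable binary data set, with a deliberate dependence between the first two variables so there is structure to learn (the file name my_binary_data.csv is hypothetical):

```python
import csv
import random

# Six binary variables; variable b is correlated with variable a,
# the remaining four are independent noise.
random.seed(1)
rows = []
for _ in range(200):
    a = random.randint(0, 1)
    b = a if random.random() < 0.8 else 1 - a   # b agrees with a 80% of the time
    rows.append([a, b] + [random.randint(0, 1) for _ in range(4)])

with open("my_binary_data.csv", "w") as f:  # hypothetical file name
    csv.writer(f).writerows(rows)
```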

Estimate the number of decomposable graphs

To estimate the number of decomposable graphs with up to 15 nodes, run, for example,

$ count_chordal_graphs -p 15 -N 20000
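For small node counts the estimate can be checked exactly by brute force. Below is a self-contained sketch (not part of trilearn) that tests chordality via maximum cardinality search (MCS) and enumerates all labeled graphs on n nodes; a graph is decomposable exactly when it is chordal:

```python
from itertools import combinations

def is_chordal(adj):
    """Chordality test: run maximum cardinality search and check that the
    reversed visit order is a perfect elimination ordering.
    adj: dict mapping node -> set of neighbours."""
    order, weight = [], {v: 0 for v in adj}
    unnumbered = set(adj)
    while unnumbered:
        v = max(unnumbered, key=weight.get)   # vertex with most numbered neighbours
        order.append(v)
        unnumbered.remove(v)
        for w in adj[v]:
            if w in unnumbered:
                weight[w] += 1
    pos = {v: i for i, v in enumerate(order)}
    for v in order:
        earlier = [w for w in adj[v] if pos[w] < pos[v]]
        if earlier:
            u = max(earlier, key=pos.get)     # the neighbour eliminated first
            if any(w != u and w not in adj[u] for w in earlier):
                return False                  # missing chord -> induced cycle >= 4
    return True

def count_chordal(n):
    """Count labeled chordal (decomposable) graphs on n nodes by enumeration."""
    edges = list(combinations(range(n), 2))
    total = 0
    for mask in range(1 << len(edges)):
        adj = {v: set() for v in range(n)}
        for i, (a, b) in enumerate(edges):
            if mask >> i & 1:
                adj[a].add(b)
                adj[b].add(a)
        total += is_chordal(adj)
    return total

print(count_chordal(4))  # 61 of the 64 labeled graphs on 4 nodes are chordal
```

Enumeration is only feasible for very small n (the number of graphs grows as 2^(n(n-1)/2)), which is why a sequential Monte Carlo estimator is needed for n around 15.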


Authors

  • Felix L. Rios. For questions, send an e-mail to felix.leopoldo.rios at gmail com.


License

This project is licensed under the Apache 2.0 License; see the LICENSE file for details.

Acknowledgments

  • Jim Holmstrom
