Bayesian predictive classification and structure learning in decomposable graphical models using particle Gibbs.

Bayesian inference in decomposable graphical models using sequential Monte Carlo methods

This library implements Bayesian inference for decomposable (triangulated) graphical models based on sequential Monte Carlo (SMC) methods. Currently supported functionality includes:

  • Bayesian structure learning for discrete log-linear and Gaussian data.

  • Estimation of the number of decomposable graphs with a given number of nodes.

  • Predictive classification using Bayesian model averaging (BMA).

  • Random generation of junction trees (the Christmas tree algorithm).
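Here, decomposable means chordal (triangulated): every cycle of length four or more has a chord. This property, and the maximal cliques a junction tree is built from, can be illustrated with networkx (a sketch for intuition only, not part of trilearn's API):

```python
import networkx as nx

# The 4-cycle has a chordless cycle of length 4: not decomposable.
g = nx.cycle_graph(4)
assert not nx.is_chordal(g)

g.add_edge(0, 2)  # add a chord (triangulate)
assert nx.is_chordal(g)

# The maximal cliques of a chordal graph can be arranged in a junction tree.
cliques = sorted(sorted(c) for c in nx.chordal_graph_cliques(g))
# cliques == [[0, 1, 2], [0, 2, 3]]
```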

Installation

This package currently requires Python 2.7. If graphviz is not installed, install it with your system package manager (e.g. brew, apt, or pacman):

$ brew install graphviz

On Ubuntu you might need to run

$ sudo apt-get install python-dev graphviz libgraphviz-dev pkg-config

Then run

$ pip install trilearn

It is also possible to pull trilearn as a Docker image:

$ docker pull onceltuca/trilearn

Running the tests

$ make test

Usage

See the Jupyter notebooks for examples of usage.
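To give a flavor of the Bayesian model averaging idea behind the predictive classification functionality: the predictive class probability is each model's prediction averaged under the posterior over models. A toy sketch (all numbers below are made up for illustration, not trilearn output):

```python
import numpy as np

# Hypothetical posterior probabilities of three competing graphical models.
posterior = np.array([0.6, 0.3, 0.1])            # p(model | data)

# Each model's predictive class probabilities for a new observation x.
per_model = np.array([[0.9, 0.1],                # p(class | x, model 1)
                      [0.4, 0.6],                # p(class | x, model 2)
                      [0.7, 0.3]])               # p(class | x, model 3)

# BMA predictive distribution: posterior-weighted average over models.
p_bma = posterior @ per_model                    # [0.73, 0.27]
predicted_class = int(p_bma.argmax())            # 0
```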

Scripts

Continuous data

To approximate the posterior distribution over decomposable graphs underlying the dataset sample_data/data_ar1-5.csv, run

$ pgibbs_ggm_sample -N 50 -M 1000 -f sample_data/data_ar1-5.csv -o results_ggm

This will produce a file containing the Markov chain generated by the particle Gibbs algorithm. To analyze the chain, run

$ analyze_graph_tajectories -i results_ggm -o results_ggm/plots

This will produce a number of summary files (in the directory given by -o) to analyze.
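The dataset's name suggests a 5-dimensional Gaussian AR(1) model. One plausible recipe for generating such data (an assumption for illustration, not necessarily how the shipped file was produced) is sampling from a Gaussian whose covariance decays geometrically with lag; its precision matrix is then tridiagonal, so the true decomposable graph is the path 0-1-2-3-4:

```python
import numpy as np

rng = np.random.RandomState(0)
p, n, rho = 5, 200, 0.5

# AR(1) covariance: Sigma[i, j] = rho^|i - j|.
# Its inverse (the precision matrix) is tridiagonal, i.e. the
# conditional-independence graph is the path 0-1-2-3-4.
Sigma = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# np.savetxt("data.csv", X, delimiter=",")  # same shape as the shipped file
```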

Discrete data

The dataset sample_data/czech_autoworkers.csv contains six binary variables. To generate a particle Gibbs trajectory of decomposable graphs, run

$ pgibbs_loglinear_sample -N 50 -M 300 -f sample_data/czech_autoworkers.csv -o results_loglin

and

$ analyze_graph_tajectories -i results_loglin -o results_loglin/plots

This will likewise produce a number of summary files in the output directory.
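The trajectory file format is specific to trilearn, but the kind of summary such an analysis step computes can be sketched on a toy chain of networkx graphs: posterior edge probabilities estimated by how often each edge appears along the trajectory (the three graphs below are made up):

```python
from collections import Counter

import networkx as nx

# A made-up "trajectory" of three graphs from a Markov chain over graphs.
trajectory = [nx.Graph([(0, 1), (1, 2)]),
              nx.Graph([(0, 1)]),
              nx.Graph([(0, 1), (0, 2)])]

# Posterior edge probability estimate: relative frequency along the chain.
counts = Counter(frozenset(e) for g in trajectory for e in g.edges())
edge_prob = {tuple(sorted(e)): c / float(len(trajectory))
             for e, c in counts.items()}
# edge_prob[(0, 1)] == 1.0; edges (1, 2) and (0, 2) each have probability 1/3
```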

Estimate the number of decomposable graphs

To estimate the number of decomposable graphs with up to 15 nodes, run, for example,

$ count_chordal_graphs -p 15 -N 20000
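count_chordal_graphs gives an SMC estimate; for very small node counts the number can be verified exactly by brute-force enumeration. A sketch using networkx (not trilearn's estimator):

```python
import itertools

import networkx as nx

# Exact count of decomposable (chordal) labeled graphs on 4 nodes:
# enumerate all 2^6 = 64 graphs and test each for chordality.
nodes = range(4)
possible_edges = list(itertools.combinations(nodes, 2))  # 6 possible edges

count = 0
for r in range(len(possible_edges) + 1):
    for edges in itertools.combinations(possible_edges, r):
        g = nx.Graph(list(edges))
        g.add_nodes_from(nodes)  # keep isolated nodes
        if nx.is_chordal(g):
            count += 1
# count == 61: only the 3 labeled 4-cycles are non-chordal
```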

Authors

  • Felix L. Rios. For any questions, send an e-mail to felix.leopoldo.rios at gmail com.

License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details

Acknowledgments

  • Jim Holmstrom
