
![GitHub](https://img.shields.io/github/license/felixleopoldo/trilearn) ![PyPI](https://img.shields.io/pypi/v/trilearn) ![Libraries.io dependency status for latest release](https://img.shields.io/librariesio/release/pypi/trilearn)

# Bayesian inference in decomposable graphical models using sequential Monte Carlo methods

This library implements Bayesian inference in decomposable (triangulated) graphical models based on sequential Monte Carlo methods. Currently supported functionality includes:

  • Bayesian structure learning for discrete log-linear and Gaussian data.

  • Estimation of the number of decomposable graphs with a given number of nodes.

  • Predictive classification using Bayesian model averaging (BMA).

  • Random generation of junction trees (the Christmas tree algorithm).
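The samplers above operate on junction trees of decomposable (chordal) graphs. As a quick illustration of these objects — using networkx rather than trilearn's own Christmas tree algorithm — a small decomposable graph and its junction tree can be built as follows:

```python
import networkx as nx

# A small decomposable (chordal) graph: two triangles glued along the edge (1, 2).
G = nx.Graph([(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)])
print(nx.is_chordal(G))  # True: the 4-cycle 0-1-3-2 has the chord (1, 2)

# A junction tree arranges the maximal cliques {0, 1, 2} and {1, 2, 3}
# (with their separator {1, 2}) into a tree.
jt = nx.junction_tree(G)
print(nx.is_tree(jt))  # True
```

Decomposability is exactly what makes such a clique tree exist, which is what the sequential Monte Carlo samplers in this package exploit.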

### Installation

If graphviz is not installed, install it first via your system package manager (brew / aptitude / pacman), for example:

`$ brew install graphviz`

Then run:

`$ pip install trilearn`

Note that this package currently requires Python 2.7.

### Running the tests

`$ make test`

## Usage

See the Jupyter notebooks for usage examples.

## Scripts

### Continuous data

To approximate the posterior over decomposable graphs underlying the dataset sample_data/data_ar1-5.csv, run

`$ pgibbs_ggm_sample -N 50 -M 1000 -f sample_data/data_ar1-5.csv`

This produces a file containing the Markov chain generated by the particle Gibbs algorithm. To analyze the chain, run

`$ analyze_graph_tajectories`

which produces a number of files in the current directory for further analysis.
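The file name suggests data simulated from an AR(1)-type dependence over five variables. A dataset with a similar structure — whose precision matrix is tridiagonal, so the underlying graph is a path and hence decomposable — can be generated with numpy. The output file name, sample size, and coefficient below are illustrative assumptions, not values taken from the package:

```python
import numpy as np

n, p, phi = 100, 5, 0.5          # samples, variables, AR(1) coefficient
rng = np.random.default_rng(0)

# Column j depends only on column j - 1: a Markov chain across variables,
# so the inverse covariance is tridiagonal and the graph is a path.
X = np.zeros((n, p))
X[:, 0] = rng.normal(size=n)
for j in range(1, p):
    X[:, j] = phi * X[:, j - 1] + rng.normal(size=n)

np.savetxt("my_ar1_data.csv", X, delimiter=",")
```

A file generated this way can then be passed to `pgibbs_ggm_sample` via the `-f` flag in place of the bundled sample data.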

### Discrete data

The dataset examples/data/czech_autoworkers.csv contains six binary variables. To generate a particle Gibbs trajectory of decomposable graphs, run

`$ pgibbs_loglinear_sample -N 50 -M 300 -f sample_data/czech_autoworkers.csv`

followed by

`$ analyze_graph_tajectories`

This produces a number of files in the current directory.

### Estimate the number of decomposable graphs

To estimate the number of decomposable graphs with up to 15 nodes, run for example

`$ count_chordal_graphs -p 15 -N 20000`
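For intuition about the quantity `count_chordal_graphs` estimates: for very small numbers of nodes, the number of decomposable (equivalently, chordal) graphs can be computed exactly by brute force over all edge subsets, here using networkx. On 4 labelled nodes, 61 of the 64 graphs are decomposable (only the three chordless 4-cycles are not):

```python
from itertools import combinations, product

import networkx as nx

p = 4
possible_edges = list(combinations(range(p), 2))

# Enumerate all 2^(p choose 2) labelled graphs on p nodes and test chordality.
count = 0
for mask in product([0, 1], repeat=len(possible_edges)):
    G = nx.Graph()
    G.add_nodes_from(range(p))
    G.add_edges_from(e for e, keep in zip(possible_edges, mask) if keep)
    if nx.is_chordal(G):
        count += 1

print(count)  # 61 labelled decomposable graphs on 4 nodes
```

This exhaustive approach is only feasible for a handful of nodes, which is why the package uses sequential Monte Carlo estimation for larger graphs.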

## Authors

  • Felix L. Rios. For any questions, send an e-mail to felix.leopoldo.rios at gmail com.

## License

This project is licensed under the Apache 2.0 License; see the [LICENSE](LICENSE) file for details.

## Acknowledgments

  • Jim Holmstrom
