
A Python Library for Deep Probabilistic Modeling

Project description



Abstract

DeeProb-kit is a general-purpose Python library providing a collection of deep probabilistic models (DPMs) that are easy to use and extend. It also includes efficiently implemented learning techniques, inference routines, and statistical algorithms. Having a representative selection of the most common DPMs in a single library makes it possible to combine them in a straightforward manner, a common practice in deep learning research nowadays that is still missing for certain classes of models. Moreover, DeeProb-kit provides high-quality, fully documented APIs, and it aims to help the community accelerate research on DPMs and improve the reproducibility of experiments.

Features

  • Inference algorithms for SPNs [1, 4] (see the evaluation sketch after this list).
  • Structure learning algorithms for SPNs [1, 2, 3, 4, 5].
  • Chow-Liu Trees (CLTs) as SPN leaves [12, 13].
  • Batch Expectation-Maximization (EM) for SPNs with arbitrary leaves [14, 15].
  • Structural marginalization and pruning algorithms for SPNs.
  • High-order moments computation for SPNs.
  • JSON I/O operations for SPNs and CLTs [4].
  • Plotting operations based on NetworkX for SPNs and CLTs [4].
  • Randomized and Tensorized SPNs (RAT-SPNs) [6].
  • Deep Generalized Convolutional SPNs (DGC-SPNs) [11].
  • Masked Autoregressive Flows (MAFs) [7].
  • Real Non-Volume-Preserving (RealNVP) flows [8].
  • Non-linear Independent Components Estimation (NICE) flows [9].
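
As a concrete illustration of what SPN inference amounts to, the snippet below evaluates the log-likelihood of a toy SPN (one sum node over two product nodes with Bernoulli leaves) bottom-up in the log domain. This is a plain NumPy/SciPy sketch of the technique only; it does not use DeeProb-kit's API, and the structure and parameters are made up for illustration.

import numpy as np
from scipy.special import logsumexp

def bernoulli_logpdf(x, p):
    # Log-probability of a binary observation under a Bernoulli(p) leaf.
    return np.log(p) if x == 1 else np.log1p(-p)

def spn_log_likelihood(x, weights, leaf_params):
    # Toy SPN: a root sum node mixing product nodes, each product node
    # multiplying one Bernoulli leaf per variable (all in the log domain).
    # Product node -> sum of its children's log-densities.
    log_products = np.array([
        bernoulli_logpdf(x[0], p0) + bernoulli_logpdf(x[1], p1)
        for p0, p1 in leaf_params
    ])
    # Sum node -> log of the weighted mixture, computed stably with logsumexp.
    return logsumexp(log_products + np.log(weights))

x = [1, 0]                              # one binary sample over (X0, X1)
weights = np.array([0.3, 0.7])          # sum-node weights (must sum to 1)
leaf_params = [(0.9, 0.2), (0.1, 0.6)]  # Bernoulli parameters per product node
print(spn_log_likelihood(x, weights, leaf_params))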

The collection of implemented models is summarized in the following table. The supported data dimensionality for each model is shown in the Input Dimensionality column, and the Supervised column indicates which models are also suitable for supervised learning tasks, in addition to density estimation.

Legend — D: one-dimensional size, C: channels, H: height, W: width.

Model      | Description                                        | Input Dimensionality | Supervised
Binary-CLT | Binary Chow-Liu Tree (CLT)                         | D                    |
SPN        | Vanilla Sum-Product Network                        | D                    |
MSPN       | Mixed Sum-Product Network                          | D                    |
XPC        | Random Probabilistic Circuit                       | D                    |
RAT-SPN    | Randomized and Tensorized Sum-Product Network      | D                    |
DGC-SPN    | Deep Generalized Convolutional Sum-Product Network | (C, D, D)            |
MAF        | Masked Autoregressive Flow                         | D                    |
NICE       | Non-linear Independent Components Estimation Flow  | D and (C, H, W)      |
RealNVP    | Real-valued Non-Volume-Preserving Flow             | D and (C, H, W)      |
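
To illustrate the Binary-CLT entry above, the following sketch learns a Chow-Liu tree structure from binary data: pairwise mutual information is estimated from smoothed empirical counts, and a maximum spanning tree over it yields the tree. This is a minimal NumPy/SciPy sketch of the underlying technique, not DeeProb-kit's implementation; the smoothing constant `alpha` and the choice of variable 0 as root are illustrative assumptions.

import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def chow_liu_structure(data, alpha=0.01):
    # data: (n_samples, n_vars) binary matrix.
    # Returns the parent index of each variable in the Chow-Liu tree (root has parent -1).
    d = data.shape[1]
    mi = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            # Smoothed joint distribution of (X_i, X_j) and its marginals.
            joint = np.array([[np.sum((data[:, i] == a) & (data[:, j] == b)) + alpha
                               for b in (0, 1)] for a in (0, 1)], dtype=float)
            joint /= joint.sum()
            pi, pj = joint.sum(axis=1), joint.sum(axis=0)
            mi[i, j] = np.sum(joint * (np.log(joint) - np.log(np.outer(pi, pj))))
    # Maximum spanning tree over mutual information = minimum spanning tree over -MI.
    mst = minimum_spanning_tree(-mi).toarray()
    adj = (mst != 0) | (mst.T != 0)
    # Orient edges away from variable 0 (an arbitrary root) by a depth-first traversal.
    parents, stack, visited = np.full(d, -1), [0], {0}
    while stack:
        u = stack.pop()
        for v in np.flatnonzero(adj[u]):
            if v not in visited:
                parents[v] = u
                visited.add(v)
                stack.append(v)
    return parents

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 5))   # toy binary dataset
print(chow_liu_structure(X))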

Installation

The library can be installed either from the PyPI repository or from the source code hosted on GitHub.

# Install from the PyPI repository
pip install deeprob-kit
# Install from the `main` git branch
pip install -e git+https://github.com/deeprob-org/deeprob-kit.git@main#egg=deeprob-kit
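
A quick way to verify the installation is to query the installed distribution's version through the standard library; this uses only importlib.metadata and the PyPI distribution name, not any DeeProb-kit API.

# Print the installed version of the `deeprob-kit` distribution
from importlib.metadata import version
print(version("deeprob-kit"))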

Project Directories

The documentation is generated automatically by Sphinx using sources stored in the docs directory.

A collection of code examples and experiments can be found in the examples and experiments directories, respectively. Moreover, benchmark code can be found in the benchmark directory.

Related Repositories

References

1. Peharz et al. On Theoretical Properties of Sum-Product Networks. AISTATS (2015).

2. Poon and Domingos. Sum-Product Networks: A New Deep Architecture. UAI (2011).

3. Molina, Vergari et al. Mixed Sum-Product Networks: A Deep Architecture for Hybrid Domains. AAAI (2018).

4. Molina, Vergari et al. SPFlow: An Easy and Extensible Library for Deep Probabilistic Learning Using Sum-Product Networks. CoRR (2019).

5. Di Mauro et al. Sum-Product Network structure learning by efficient product nodes discovery. AIxIA (2018).

6. Peharz et al. Probabilistic Deep Learning using Random Sum-Product Networks. UAI (2020).

7. Papamakarios et al. Masked Autoregressive Flow for Density Estimation. NeurIPS (2017).

8. Dinh et al. Density Estimation using RealNVP. ICLR (2017).

9. Dinh et al. NICE: Non-linear Independent Components Estimation. ICLR (2015).

10. Papamakarios, Nalisnick et al. Normalizing Flows for Probabilistic Modeling and Inference. JMLR (2021).

11. Van de Wolfshaar and Pronobis. Deep Generalized Convolutional Sum-Product Networks for Probabilistic Image Representations. PGM (2020).

12. Rahman et al. Cutset Networks: A Simple, Tractable, and Scalable Approach for Improving the Accuracy of Chow-Liu Trees. ECML-PKDD (2014).

13. Di Mauro, Gala et al. Random Probabilistic Circuits. UAI (2021).

14. Desana and Schnörr. Learning Arbitrary Sum-Product Network Leaves with Expectation-Maximization. CoRR (2016).

15. Peharz et al. Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits. ICML (2020).

