
A package of mixture models including Skew GMM, GMN and DGMM

Project description

Mixes - repository of mixture models

This repository was created as part of the research paper "Estimation of Gaussian Mixture Networks", written for the degree of Master of Mathematics in Computational Mathematics at the University of Waterloo, authored by Andriy Dmytruk and supervised by Ryan Browne.

The repository includes implementations of the following mixture models:

  • Gaussian Mixture Model (GMM)
  • Skew Gaussian Mixture Model (Skew GMM)
  • Deep Gaussian Mixture Model (DGMM)
  • Gaussian Mixture Network (GMN)

Usage

The implementation is in the mixes/ folder. You can see an example of usage in the experiments/example.ipynb Jupyter notebook.

All the experiments that were performed as part of the research paper can also be found inside the experiments/ folder.
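A minimal sketch of the intended workflow. The import and the fit/predict method names are assumptions based on the model names in this README (scikit-learn-style naming); the notebook above shows the actual API.

    import numpy as np

    from mixes import GMM  # assumed export name

    data = np.random.randn(1000, 5)  # toy data: 1000 points in 5 dimensions

    model = GMM(3)                # a 3-component Gaussian mixture (signature assumed)
    model.fit(data)               # fit by EM
    labels = model.predict(data)  # cluster assignments (method name assumed)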

Model descriptions

Skew Gaussian Mixture Model

Skew GMM was implemented based on the paper "Maximum likelihood estimation for multivariate skew normal mixture models" by Tsung-I Lin (2006).

Gaussian Mixture Network

GMN was proposed in the author's research paper. The model creates a network of Gaussian distributions, where each subsequent layer has a conditional probability distribution based on the previous layer. Since each layer is a mixture of components, the whole model forms a network of Gaussian nodes.

The most important parameters are:

  • layer_sizes - the sizes of the layers. The first layer is used for clustering, so its size should correspond to the desired number of clusters.
  • layer_dims - the input dimensions of each layer. Each layer has an input and an output dimension. The output dimension of the first layer is automatically set to the dimensionality of the data, and the input of each layer is the output of the layer that follows it. Reducing the dimensionality of the deeper layers gives the model fewer parameters and makes each layer similar to a Mixture of Factor Analyzers. You would probably want to set the dimensions in non-increasing order (see the configuration sketch after this list).
  • init - determines how the model is initialized. Use kmeans (the default) for initialization by K-Means and factor analysis on each layer, or random for a completely random initialization.
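For illustration, here is a hypothetical configuration following the guidelines above. The parameter names come from the list; the GMN constructor itself and its exact signature are assumptions, so consult experiments/example.ipynb for the actual API.

    from mixes import GMN  # assumed export name

    # For 10-dimensional data the first layer's output dimension becomes 10
    # automatically. The layer_dims below are non-increasing, as suggested.
    model = GMN(
        layer_sizes=[4, 3, 2],  # 4 clusters at the first layer
        layer_dims=[6, 4, 2],   # input dimension of each layer
        init="kmeans",          # default: K-Means + factor analysis per layer
    )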

Deep Gaussian Mixture Model

DGMM is based on the papers "Deep Gaussian mixture models" by Cinzia Viroli and Geoffrey J. McLachlan (2019) and "Factoring Variations in Natural Images with Deep Gaussian Mixture Models" by Aaron van den Oord and Benjamin Schrauwen (2014).

The parameters are similar to those of the GMN model, as is the implementation in this repository.

The difference between DGMM and GMN is that in GMN the probabilities of a layer's components are conditional on the previous layer, while in DGMM they are independent of it.
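Since the parameters mirror those of GMN, a DGMM would be constructed the same way. As before, the class name and constructor signature are assumptions:

    from mixes import DGMM  # assumed export name

    # Same layer structure as the GMN sketch above; only the coupling between
    # layers differs (component probabilities independent of the previous layer).
    model = DGMM(layer_sizes=[4, 3, 2], layer_dims=[6, 4, 2], init="kmeans")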

Annealing

We implemented deterministic annealing for mixture models as described in the paper "On the Bumpy Road to the Dominant Mode" by Hua Zhou and Kenneth L. Lange (2010).

Since the log-likelihood function is frequently non-concave, the EM algorithm can end up in a suboptimal mode. The idea of annealing is to flatten the objective function and thereby increase the chance of terminating in the dominant mode.

The parameter use_annealing determines whether annealing is used, and annealing_start_v sets the initial annealing value, which must be between 0 and 1. Lower values correspond to a more flattened objective function, while 1 corresponds to no annealing. When use_annealing is set to true, the annealing value starts at annealing_start_v and is gradually increased to 1 during model fitting.
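A sketch, assuming these parameters are passed to the model constructor:

    from mixes import GMN  # assumed export name

    # Start from a strongly flattened objective (v = 0.3); fitting then
    # gradually raises the annealing value to 1, i.e. no flattening.
    model = GMN(
        layer_sizes=[3],
        layer_dims=[2],
        use_annealing=True,
        annealing_start_v=0.3,
    )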

Regularization

The GMM, GMN and DGMM models have a variance regularization parameter var_regularization. Regularization enlarges the covariances on each step, which keeps the covariance matrices from becoming nearly singular, a condition that would greatly degrade optimization. The parameter can also be used to restrict the model to larger covariances and thus avoid overfitting.
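Conceptually, this amounts to inflating each estimated covariance matrix on every step, for example by adding a multiple of the identity to its diagonal. The snippet below is a sketch of the idea, not the package's exact update rule:

    import numpy as np

    def regularize_covariance(cov, var_regularization):
        # Adding var_regularization to the diagonal keeps the matrix
        # comfortably away from singularity.
        return cov + var_regularization * np.eye(cov.shape[0])

    cov = np.array([[1e-10, 0.0],
                    [0.0,   2.0]])  # nearly singular covariance
    print(np.linalg.cond(cov))                              # enormous condition number
    print(np.linalg.cond(regularize_covariance(cov, 0.1)))  # far better conditioned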

Stopping criterion

Use the stopping_criterion parameter of the models to specify a stopping criterion. The specified function must have the same signature as the functions in the mixes/stopping_criterion.py file.
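For example, a criterion that stops once the log-likelihood improvement falls below a tolerance. The argument list here is a guess, so match it to the functions actually defined in mixes/stopping_criterion.py:

    def stop_on_small_improvement(iteration, log_lik_history, tol=1e-6):
        # Hypothetical signature: receives the iteration number and the
        # history of log-likelihood values, and returns True to stop fitting.
        if len(log_lik_history) < 2:
            return False
        return abs(log_lik_history[-1] - log_lik_history[-2]) < tol

    # Usage sketch: GMN(..., stopping_criterion=stop_on_small_improvement)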

Project details


Release history

This version

1.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mixes-1.0.tar.gz (23.0 kB)

Uploaded Source

Built Distribution

mixes-1.0-py3-none-any.whl (24.0 kB)

Uploaded Python 3

File details

Details for the file mixes-1.0.tar.gz.

File metadata

  • Download URL: mixes-1.0.tar.gz
  • Upload date:
  • Size: 23.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.8.8

File hashes

Hashes for mixes-1.0.tar.gz

  • SHA256: a374c438693ad31390e34c85b8316bcc5f65953ebeb080dec8a5329a2251b4a0
  • MD5: 1474b936ec1f081f9be1e6639427cce8
  • BLAKE2b-256: 45810b09194f17947e50160ef2123f59c3d5cb75ef6a3bef1cd546d9606b99fc


File details

Details for the file mixes-1.0-py3-none-any.whl.

File metadata

  • Download URL: mixes-1.0-py3-none-any.whl
  • Upload date:
  • Size: 24.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.8.8

File hashes

Hashes for mixes-1.0-py3-none-any.whl

  • SHA256: 2605dbc23c07eb1c6da657bcaaa33f1a64207440fe9305c7b39cf63ad54b13a0
  • MD5: 0d54baf23497800dab74ef2bff6fa5df
  • BLAKE2b-256: 2ae335a2d41d7436d9dee1a775f9a2587361ea9235ee5b3cb527268f118027d7

