A non-parametric Bayesian approach to Hidden Markov Models

Bayesian Hidden Markov Models

This code implements a non-parametric Bayesian Hidden Markov Model, sometimes referred to as a Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) or an Infinite Hidden Markov Model (iHMM). The package supports both a standard non-parametric Bayesian HMM and a sticky HDP-HMM (see references). Inference is performed via Markov chain Monte Carlo (MCMC) estimation, with efficient beam sampling for the latent-sequence resampling steps and multithreading, where possible, for parameter resampling.

Installation

The current version is development-only, and installation is recommended only for users who understand the risks. It can be installed from PyPI:

pip install bayesian-hmm

Hidden Markov Models

Hidden Markov Models are powerful time series models that use latent variables to explain observed emission sequences. The result is a generative model for time series data which is often tractable and easily understood. The latent series is assumed to be a Markov chain, which requires a starting distribution and a transition distribution, as well as an emission distribution to tie emissions to latent states.

Traditional parametric Hidden Markov Models use a fixed number of states for the latent Markov chain. Hierarchical Dirichlet Process Hidden Markov Models (including the one implemented by the bayesian_hmm package) allow the number of latent states to vary as part of the fitting process. This is done by placing a hierarchical Dirichlet prior on the latent state starting and transition distributions, and performing MCMC sampling on the latent states to estimate the model parameters.
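As a concrete illustration of this generative process (a self-contained sketch, not part of the bayesian_hmm package), a fixed-size parametric HMM can be simulated from its three distributions as:

```python
import random

def sample_hmm(start, trans, emit, length):
    """Draw one latent/emission sequence pair from a parametric HMM.

    start[k]: starting probability of state k
    trans[j][k]: transition probability from state j to state k
    emit[k][y]: probability that state k emits symbol y
    """
    def draw(probs):
        # sample an index from a discrete distribution
        r, acc = random.random(), 0.0
        for idx, p in enumerate(probs):
            acc += p
            if r < acc:
                return idx
        return len(probs) - 1

    state = draw(start)
    states, emissions = [state], [draw(emit[state])]
    for _ in range(length - 1):
        state = draw(trans[state])       # latent chain step
        states.append(state)
        emissions.append(draw(emit[state]))  # tie emission to latent state
    return states, emissions
```

The HDP-HMM replaces the fixed-size `start` and `trans` tables with draws from a hierarchical Dirichlet process, so the number of states is learned rather than fixed.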

Usage

Basic usage involves supplying a list of emission sequences, initialising the HDPHMM, and performing MCMC estimation. The example below constructs some artificial observation series and uses a brief MCMC run to estimate the model parameters. We use a moderately sized dataset to showcase the speed of the package: 50 sequences of length 200, with 500 MCMC steps.

import bayesian_hmm

# create emission sequences
base_sequence = list(range(5)) + list(range(5, 0, -1))
sequences = [base_sequence * 20 for _ in range(50)]

# initialise object with overestimate of true number of latent states
hmm = bayesian_hmm.HDPHMM(sequences, sticky=False)
hmm.initialise(k=20)

# estimate parameters, making use of multithreading functionality
results = hmm.mcmc(n=500, burn_in=100)

# print final probability estimates (expect 10 latent states)
hmm.print_probabilities()

The bayesian_hmm package can handle more advanced usage, including:

  • Multiple emission sequences,
  • Emission series of varying length,
  • Any categorical emission distribution,
  • Multithreaded MCMC estimation, and
  • Starting probability estimation, which shares a Dirichlet prior with the transition probabilities.

Inference

This code uses an MCMC approach to parameter estimation. We use efficient beam sampling on the latent sequences, as well as Metropolis-Hastings sampling on each of the hyperparameters. We approximate the exact resampling steps by calculating probability estimates over all states of interest, leaving the probabilities unadjusted for the variable currently being resampled rather than removing it before drawing the sampled estimate.
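A minimal sketch of the beam (slice) sampling idea for a finite-state HMM may help make the latent-sequence step concrete. This is an illustration of the general technique from Van Gael et al. (2008), not the package's implementation; all names are illustrative:

```python
import random

def beam_sample_states(init, trans, emit, obs, states_prev):
    """One beam-sampling sweep: resample the latent sequence given the
    current one, using auxiliary slice variables to limit each step to
    transitions whose probability exceeds the slice threshold."""
    T, K = len(obs), len(trans)

    # 1. Slice variables: u_t ~ Uniform(0, prob of the current transition)
    u = []
    for t in range(T):
        p = init[states_prev[0]] if t == 0 else trans[states_prev[t - 1]][states_prev[t]]
        u.append(random.uniform(0.0, p))

    # 2. Forward filtering, restricted to transitions that clear the slice
    alpha = [[0.0] * K for _ in range(T)]
    for k in range(K):
        if init[k] > u[0]:
            alpha[0][k] = emit[k][obs[0]]
    for t in range(1, T):
        for k in range(K):
            total = sum(alpha[t - 1][j] for j in range(K) if trans[j][k] > u[t])
            alpha[t][k] = total * emit[k][obs[t]]

    # 3. Backward sampling of a new latent sequence
    def draw(weights):
        r = random.uniform(0.0, sum(weights))
        for k, w in enumerate(weights):
            r -= w
            if r <= 0.0:
                return k
        return K - 1

    states = [0] * T
    states[T - 1] = draw(alpha[T - 1])
    for t in range(T - 2, -1, -1):
        weights = [alpha[t][j] if trans[j][states[t + 1]] > u[t + 1] else 0.0
                   for j in range(K)]
        states[t] = draw(weights)
    return states
```

In the infinite-state setting the same slice trick truncates the (countably infinite) transition matrix to a finite set of reachable states at each step, which is what makes exact conditional resampling tractable.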

Outstanding issues and future work

The following improvements are priorities for future work:

  • Expand the package to include standard non-Bayesian HMM functions, such as the Baum-Welch and Viterbi algorithms
  • Allow for missing or NULL emissions, which do not impact the model probability
  • Include functionality to use maximum likelihood estimates for the hyperparameters (currently only Metropolis-Hastings resampling is possible for hyperparameters)

References

Van Gael, J., Saatci, Y., Teh, Y. W., & Ghahramani, Z. (2008, July). Beam sampling for the infinite hidden Markov model. In Proceedings of the 25th international conference on Machine learning (pp. 1088-1095). ACM.

Beal, M. J., Ghahramani, Z., & Rasmussen, C. E. (2002). The infinite hidden Markov model. In Advances in neural information processing systems (pp. 577-584).

Fox, E. B., Sudderth, E. B., Jordan, M. I., & Willsky, A. S. (2007). The sticky HDP-HMM: Bayesian nonparametric hidden Markov models with persistent states. arXiv preprint.
