
A framework for the analysis of discrete-time Markov chains.

Project description

PyDTMC is a full-featured, lightweight library for the analysis of discrete-time Markov chains. It provides classes and functions for creating, manipulating, simulating and visualizing Markovian stochastic processes.


Requirements

The Python environment must include all the packages listed in the project requirements.

The Sphinx package is required only for building the package documentation, and the pytest package only for running the unit tests. For a better user experience, it is recommended to install Graphviz and pydot before using the plot_graph function.

Installation & Upgrade

PyPI:

$ pip install PyDTMC
$ pip install --upgrade PyDTMC

Conda:

$ conda install -c tommasobelluzzo pydtmc
$ conda update -c tommasobelluzzo pydtmc

GitHub:

$ pip install https://github.com/TommasoBelluzzo/PyDTMC/tarball/master
$ pip install --upgrade https://github.com/TommasoBelluzzo/PyDTMC/tarball/master

$ pip install git+https://github.com/TommasoBelluzzo/PyDTMC.git#egg=PyDTMC
$ pip install --upgrade git+https://github.com/TommasoBelluzzo/PyDTMC.git#egg=PyDTMC

Usage

The core element of the library is the MarkovChain class, which can be instantiated as follows:

>>> from pydtmc import MarkovChain
>>> p = [[0.2, 0.7, 0.0, 0.1], [0.0, 0.6, 0.3, 0.1], [0.0, 0.0, 1.0, 0.0], [0.5, 0.0, 0.5, 0.0]]
>>> mc = MarkovChain(p, ['A', 'B', 'C', 'D'])
>>> print(mc)

DISCRETE-TIME MARKOV CHAIN
 SIZE:           4
 RANK:           4
 CLASSES:        2
  > RECURRENT:   1
  > TRANSIENT:   1
 ERGODIC:        NO
  > APERIODIC:   YES
  > IRREDUCIBLE: NO
 ABSORBING:      YES
 REGULAR:        NO
 REVERSIBLE:     NO
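
A valid transition matrix must be row-stochastic: every row sums to one and contains no negative entries. Below is a quick, purely illustrative check with NumPy; it is not part of the PyDTMC API:

>>> import numpy as np
>>> p = np.array([[0.2, 0.7, 0.0, 0.1], [0.0, 0.6, 0.3, 0.1], [0.0, 0.0, 1.0, 0.0], [0.5, 0.0, 0.5, 0.0]])
>>> # Every row must sum to 1 and all entries must be non-negative.
>>> print(np.allclose(p.sum(axis=1), 1.0) and np.all(p >= 0.0))
True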

Below are a few examples of MarkovChain properties:

>>> print(mc.is_ergodic)
False

>>> print(mc.recurrent_states)
['C']

>>> print(mc.transient_states)
['A', 'B', 'D']

>>> print(mc.steady_states)
[array([0.0, 0.0, 1.0, 0.0])]

>>> print(mc.is_absorbing)
True

>>> print(mc.fundamental_matrix)
[[1.50943396 2.64150943 0.41509434]
 [0.18867925 2.83018868 0.30188679]
 [0.75471698 1.32075472 1.20754717]]
 
>>> print(mc.kemeny_constant)
5.547169811320755

>>> print(mc.entropy_rate)
0.0
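
The figures above can be cross-checked against the textbook formulas: the steady state is a distribution pi satisfying pi P = pi, and the fundamental matrix equals (I - Q)^-1, where Q is the transition matrix restricted to the transient states A, B and D. A minimal, illustrative verification with NumPy (not part of the PyDTMC API):

>>> import numpy as np
>>> p = np.array([[0.2, 0.7, 0.0, 0.1], [0.0, 0.6, 0.3, 0.1], [0.0, 0.0, 1.0, 0.0], [0.5, 0.0, 0.5, 0.0]])
>>> pi = np.array([0.0, 0.0, 1.0, 0.0])
>>> print(np.allclose(pi @ p, pi))  # the steady state is invariant under the transition matrix
True
>>> q = p[np.ix_([0, 1, 3], [0, 1, 3])]  # transient-to-transient block (states A, B, D)
>>> print(np.linalg.inv(np.eye(3) - q))  # matches mc.fundamental_matrix
[[1.50943396 2.64150943 0.41509434]
 [0.18867925 2.83018868 0.30188679]
 [0.75471698 1.32075472 1.20754717]]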

Below are a few examples of MarkovChain methods:

>>> print(mc.mean_absorption_times())
[4.56603774 3.32075472 3.28301887]

>>> print(mc.absorption_probabilities())
[1.0 1.0 1.0]

>>> print(mc.expected_rewards(10, [2, -3, 8, -7]))
[-2.76071635, -12.01665113, 23.23460025, -8.45723276]

>>> print(mc.expected_transitions(2))
[[0.085, 0.2975, 0.0,    0.0425]
 [0.0,   0.345,  0.1725, 0.0575]
 [0.0,   0.0,    0.7,    0.0   ]
 [0.15,  0.0,    0.15,   0.0   ]]

>>> print(mc.first_passage_probabilities(5, 3))
[[0.5, 0.0,    0.5,    0.0   ]
 [0.0, 0.35,   0.0,    0.05  ]
 [0.0, 0.07,   0.13,   0.045 ]
 [0.0, 0.0315, 0.1065, 0.03  ]
 [0.0, 0.0098, 0.0761, 0.0186]]
 
>>> print(mc.hitting_probabilities([0, 1]))
[1.0, 1.0, 0.0, 0.5]
 
>>> print(mc.walk(10))
['B', 'B', 'B', 'D', 'A', 'B', 'B', 'C', 'C', 'C']
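
Conceptually, walk draws each step from the row of the transition matrix associated with the current state, so the result is a random sequence of state labels that depends on the underlying random generator. A minimal re-implementation sketch with NumPy (the seed and the starting state are arbitrary assumptions for the sketch, not library defaults):

>>> import numpy as np
>>> rng = np.random.default_rng(42)
>>> p = np.array([[0.2, 0.7, 0.0, 0.1], [0.0, 0.6, 0.3, 0.1], [0.0, 0.0, 1.0, 0.0], [0.5, 0.0, 0.5, 0.0]])
>>> states, current, steps = ['A', 'B', 'C', 'D'], 0, []
>>> for _ in range(10):
...     current = rng.choice(4, p=p[current])  # draw the next state index from the current row
...     steps.append(states[current])
...
>>> print(steps)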

Plotting functions provide a visual representation of MarkovChain instances. In order to display the output of plots immediately, the interactive mode of Matplotlib must be turned on.
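
A minimal setup sketch, assuming the plotting functions are exposed at the package top level as in the calls below and that Matplotlib is installed:

>>> import matplotlib.pyplot as plt
>>> from pydtmc import plot_eigenvalues, plot_graph, plot_walk
>>> plt.ion()  # enable interactive mode so figures are displayed immediately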

>>> plot_eigenvalues(mc)

[Eigenvalues plot]

>>> plot_graph(mc)

[Graph plot]

>>> plot_walk(mc, 10, 'sequence')

[Walk plot]

