dit is a Python package for information theory.
- Dependencies:
Python 2.7, 3.3, 3.4, 3.5, or 3.6
boltons
contextlib2
debtcollector
networkx
numpy
prettytable
scipy
six
- Optional Dependencies:
colorama
cython
numdifftools
scikit-learn
- Note:
The Cython extensions are currently not supported on Windows. Please install using the --nocython option.
- Install:
The easiest way to install is:
pip install dit
Alternatively, you can clone this repository, move into the newly created dit directory, and then install the package:
git clone https://github.com/dit/dit.git
cd dit
pip install .
- Mailing list:
None
- Code and bug tracker:
https://github.com/dit/dit
- License:
BSD 2-Clause, see LICENSE.txt for details.
Implemented Measures
dit implements the following information measures. Most are implemented in multivariate and conditional generality, wherever such generalizations either exist in the literature or are relatively obvious. For example, though it does not appear in the literature, the multivariate conditional exact common information is implemented here.
Entropies:
Shannon Entropy
Rényi Entropy
Tsallis Entropy
Necessary Conditional Entropy
Residual Entropy / Independent Information / Variation of Information
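The first three entropies above have closed-form definitions that are easy to check by hand. As an illustrative sketch (not dit's own implementation — dit exposes these through functions such as dit.shannon.entropy, shown in the quickstart below), here are the formulas in plain Python:

```python
from math import log2

def shannon_entropy(pmf):
    """H(X) = -sum p log2 p, in bits."""
    return -sum(p * log2(p) for p in pmf if p > 0)

def renyi_entropy(pmf, alpha):
    """Rényi entropy of order alpha (alpha != 1); tends to Shannon as alpha -> 1."""
    return log2(sum(p ** alpha for p in pmf)) / (1 - alpha)

def tsallis_entropy(pmf, q):
    """Tsallis entropy of order q (q != 1)."""
    return (1 - sum(p ** q for p in pmf)) / (q - 1)

# The "thick coin" from the quickstart below:
coin = [0.4, 0.4, 0.2]
print(shannon_entropy(coin))  # ≈ 1.5219 bits, matching dit.shannon.entropy
```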
Mutual Informations:
Co-Information
Interaction Information
Total Correlation / Multi-Information
Dual Total Correlation / Binding Information
CAEKL Multivariate Mutual Information
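The multivariate mutual informations above all reduce to sums and differences of subset entropies. The following is a minimal pure-Python sketch of two of them (dit's own implementations are more general and handle conditioning); the XOR distribution used here reappears in the quickstart below:

```python
from itertools import combinations
from math import log2

# Joint distribution of Z = xor(X, Y) with uniform inputs.
xor = {('0', '0', '0'): 0.25, ('0', '1', '1'): 0.25,
       ('1', '0', '1'): 0.25, ('1', '1', '0'): 0.25}

def entropy(pmf):
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, indices):
    """Marginalize the joint pmf onto the given variable indices."""
    out = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in indices)
        out[key] = out.get(key, 0.0) + p
    return out

def total_correlation(pmf, n):
    # T(X_1; ...; X_n) = sum_i H(X_i) - H(X_1, ..., X_n)
    return sum(entropy(marginal(pmf, [i])) for i in range(n)) - entropy(pmf)

def dual_total_correlation(pmf, n):
    # B = sum_i H(X_{-i}) - (n - 1) * H(X_1, ..., X_n)
    others = [[j for j in range(n) if j != i] for i in range(n)]
    return sum(entropy(marginal(pmf, o)) for o in others) - (n - 1) * entropy(pmf)

def coinformation(pmf, n):
    # Inclusion-exclusion over the entropies of all non-empty subsets.
    total = 0.0
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            total += (-1) ** k * entropy(marginal(pmf, subset))
    return -total
```

For XOR, the total correlation is 1 bit, the dual total correlation is 2 bits, and the co-information is the classic -1 bit.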
Divergences:
Variational Distance
Kullback-Leibler Divergence
Cross Entropy
Jensen-Shannon Divergence
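These divergences compare two distributions over the same alphabet. A minimal sketch of their definitions, assuming aligned probability vectors (dit's versions work on Distribution objects instead):

```python
from math import log2

def cross_entropy(p, q):
    """H(p, q) = -sum p log2 q; diverges if q(x) = 0 where p(x) > 0."""
    return -sum(pi * log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """D(p || q) = H(p, q) - H(p)."""
    return cross_entropy(p, q) - cross_entropy(p, p)

def js_divergence(p, q):
    """JSD(p, q) = H(m) - (H(p) + H(q)) / 2, m the equal-weight mixture."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    h = lambda d: cross_entropy(d, d)  # H(d, d) is the Shannon entropy
    return h(m) - (h(p) + h(q)) / 2

def variational_distance(p, q):
    """Half the L1 distance between the two pmfs."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q)) / 2
```

Note that the Jensen-Shannon divergence is bounded by 1 bit and is symmetric, while the Kullback-Leibler divergence is neither symmetric nor bounded.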
Common Informations:
Gács-Körner Common Information
Wyner Common Information
Exact Common Information
Functional Common Information
MSS Common Information
Secret Key Agreement bounds:
Intrinsic Mutual Information
Reduced Intrinsic Mutual Information
Minimal Intrinsic Mutual Information
Necessary Intrinsic Mutual Information
Partial Information Decompositions:
\(I_{min}\)
\(I_{\wedge}\)
\(I_{\downarrow}\)
\(I_{proj}\)
\(I_{BROJA}\)
\(I_{ccs}\)
\(I_{\pm}\)
\(I_{dep}\)
Other measures:
Channel Capacity
Complexity Profile
Connected Informations
Cumulative Residual Entropy
Extropy
Information Diagrams
Information Trimming
Lautum Information
LMPR Complexity
Marginal Utility of Information
Maximum Correlation
Maximum Entropy Distributions
Perplexity
TSE Complexity
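Two of the measures above, perplexity and extropy, have simple closed forms worth seeing next to the entropy. This is an illustrative plain-Python sketch (the quickstart below uses dit's own dit.other.extropy); the values for the quickstart's thick coin match the outputs shown there:

```python
from math import log2

def shannon_entropy(pmf):
    return -sum(p * log2(p) for p in pmf if p > 0)

def perplexity(pmf):
    """2**H(X): the effective number of equally likely outcomes."""
    return 2 ** shannon_entropy(pmf)

def extropy(pmf):
    """J(X) = -sum (1 - p) log2(1 - p): the complementary dual of entropy."""
    return -sum((1 - p) * log2(1 - p) for p in pmf if p < 1)

coin = [0.4, 0.4, 0.2]
print(extropy(coin))  # ≈ 1.1419, matching dit.other.extropy in the quickstart
```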
Quickstart
The basic usage of dit corresponds to creating distributions, modifying them if need be, and then computing properties of those distributions. First, we import:
>>> import dit
Suppose we have a really thick coin, one so thick that there is a reasonable chance of it landing on its edge. Here is how we might represent the coin in dit.
>>> d = dit.Distribution(['H', 'T', 'E'], [.4, .4, .2])
>>> print(d)
Class: Distribution
Alphabet: ('E', 'H', 'T') for all rvs
Base: linear
Outcome Class: str
Outcome Length: 1
RV Names: None
x p(x)
E 0.2
H 0.4
T 0.4
Calculate the probability of H and also of the combination H or T.
>>> d['H']
0.4
>>> d.event_probability(['H','T'])
0.8
Calculate the Shannon entropy and extropy of the joint distribution.
>>> dit.shannon.entropy(d)
1.5219280948873621
>>> dit.other.extropy(d)
1.1419011889093373
Create a distribution where Z = xor(X, Y).
>>> import dit.example_dists
>>> d = dit.example_dists.Xor()
>>> d.set_rv_names(['X', 'Y', 'Z'])
>>> print(d)
Class: Distribution
Alphabet: ('0', '1') for all rvs
Base: linear
Outcome Class: str
Outcome Length: 3
RV Names: ('X', 'Y', 'Z')
x p(x)
000 0.25
011 0.25
101 0.25
110 0.25
Calculate the Shannon mutual informations I[X:Z], I[Y:Z], and I[X,Y:Z].
>>> dit.shannon.mutual_information(d, ['X'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['Y'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['X', 'Y'], ['Z'])
1.0
Calculate the marginal distribution P(X,Z). Then print its probabilities as fractions, showing the mask.
>>> d2 = d.marginal(['X', 'Z'])
>>> print(d2.to_string(show_mask=True, exact=True))
Class: Distribution
Alphabet: ('0', '1') for all rvs
Base: linear
Outcome Class: str
Outcome Length: 2 (mask: 3)
RV Names: ('X', 'Z')
x p(x)
0*0 1/4
0*1 1/4
1*0 1/4
1*1 1/4
Convert the distribution probabilities to log (base 3.5) probabilities, and access its probability mass function.
>>> d2.set_base(3.5)
>>> d2.pmf
array([-1.10658951, -1.10658951, -1.10658951, -1.10658951])
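The base-3.5 log-probabilities can be checked by hand: each outcome of P(X, Z) has probability 1/4, and by the change-of-base formula log_3.5(1/4) = ln(1/4) / ln(3.5). A quick stdlib check:

```python
from math import log

# log of 1/4 in base 3.5, via the two-argument form of math.log
value = log(0.25, 3.5)
print(value)  # ≈ -1.10658951, matching each entry of d2.pmf above
```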
Draw 5 random samples from this distribution.
>>> dit.math.prng.seed(1)
>>> d2.rand(5)
['01', '10', '00', '01', '00']
Enjoy!