dit is a Python package for information theory.

## Introduction

Information theory is a powerful extension to probability and statistics, quantifying dependencies among arbitrary random variables in a way that is consistent and comparable across systems and scales. Information theory was originally developed to quantify how quickly and reliably information could be transmitted across an arbitrary channel. The demands of modern, data-driven science have been co-opting and extending these quantities and methods into unfamiliar, multivariate settings where the interpretations and best practices are not yet established. For example, there are at least four reasonable multivariate generalizations of the mutual information, none of which inherit all the interpretations of the standard bivariate case; which is best to use is context-dependent. dit implements a vast range of multivariate information measures in an effort to allow information practitioners to study how these various measures behave and interact in a variety of contexts. We hope that having all these measures and techniques implemented in one place will allow the development of robust techniques for the automated quantification of dependencies within a system, and concrete interpretation of what those dependencies mean.
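
As a concrete taste of those generalizations, here is a minimal sketch computing four of them on the xor distribution bundled with dit. It assumes the `dit.multivariate` function names (`coinformation`, `total_correlation`, `dual_total_correlation`, `caekl_mutual_information`) found in recent releases:

```
>>> import dit.example_dists
>>> from dit.multivariate import (coinformation, total_correlation,
...                               dual_total_correlation, caekl_mutual_information)
>>> d = dit.example_dists.Xor()      # three binary variables with Z = xor(X, Y)
>>> coinformation(d)                 # I[X:Y:Z]
-1.0
>>> total_correlation(d)             # a.k.a. multi-information
1.0
>>> dual_total_correlation(d)        # a.k.a. binding information
2.0
>>> caekl_mutual_information(d)
0.5
```

All four are "the" mutual information when restricted to two variables, yet they disagree on this simple three-variable distribution.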

## Citing

```
@article{dit,
    Author = {James, R. G. and Ellison, C. J. and Crutchfield, J. P.},
    Title = {{dit}: a {P}ython package for discrete information theory},
    Journal = {The Journal of Open Source Software},
    Volume = {3},
    Number = {25},
    Pages = {738},
    Year = {2018},
    Doi = {10.21105/joss.00738},
}
```

## Basic Information

### Documentation

http://docs.dit.io

### Downloads

https://pypi.org/project/dit/

https://anaconda.org/conda-forge/dit

### Dependencies

dit's required dependencies are installed automatically when installing with `pip` or `conda`.

#### Optional Dependencies

- `colorama`: colored column heads in PID indicating failure modes
- `cython`: faster sampling from distributions
- `hypothesis`: random sampling of distributions
- `matplotlib`, `python-ternary`: plotting of various information-theoretic expansions
- `numdifftools`: numerical evaluation of gradients and Hessians during optimization
- `pint`: adds units to informational values
- `scikit-learn`: faster nearest-neighbor lookups during entropy/mutual information estimation from samples

### Install

The easiest way to install is:

```
pip install dit
```

If you want to install dit within a conda environment, you can simply do:

```
conda install -c conda-forge dit
```

Alternatively, you can clone this repository, move into the newly created dit directory, and then install the package:

```
git clone https://github.com/dit/dit.git
cd dit
pip install .
```

**Note:** The cython extensions are currently not supported on Windows. Please install using the `--nocython` option.
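
Whichever route you take, a quick import check confirms the installation (this assumes dit's standard `__version__` attribute; the exact string depends on the release you installed):

```
>>> import dit
>>> dit.__version__   # exact value depends on the installed release
'1.5'
```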

### Testing

```
$ git clone https://github.com/dit/dit.git
$ cd dit
$ pip install -r requirements_testing.txt
$ py.test
```

### Code and bug tracker

https://github.com/dit/dit

### License

BSD 3-Clause; see LICENSE.txt for details.

## Implemented Measures

dit implements the following information measures. Most of these are implemented in their multivariate and conditional generality, where such generalizations either exist in the literature or are relatively obvious. For example, though it does not appear in the literature, the multivariate conditional exact common information is implemented here.

- Entropies
  - Shannon Entropy
  - Renyi Entropy
  - Tsallis Entropy
  - Necessary Conditional Entropy
  - Residual Entropy / Independent Information / Variation of Information
- Mutual Informations
  - Co-Information
  - Interaction Information
  - Total Correlation / Multi-Information
  - Dual Total Correlation / Binding Information
  - CAEKL Multivariate Mutual Information
- Divergences
  - Variational Distance
  - Kullback-Leibler Divergence / Relative Entropy
  - Cross Entropy
  - Jensen-Shannon Divergence
  - Earth Mover's Distance
- Other Measures
  - Channel Capacity
  - Complexity Profile
  - Connected Informations
  - Copy Mutual Information
  - Cumulative Residual Entropy
  - Extropy
  - Hypercontractivity Coefficient
  - Information Bottleneck
  - Information Diagrams
  - Information Trimming
  - Lautum Information
  - LMPR Complexity
  - Marginal Utility of Information
  - Maximum Correlation
  - Maximum Entropy Distributions
  - Perplexity
  - Rate-Distortion Theory
  - TSE Complexity
- Common Informations
  - Gacs-Korner Common Information
  - Wyner Common Information
  - Exact Common Information
  - Functional Common Information
  - MSS Common Information
- Partial Information Decomposition
  - I_{min}
  - I_{wedge}
  - I_{RR}
  - I_{downarrow}
  - I_{proj}
  - I_{BROJA}
  - I_{ccs}
  - I_{pm}
  - I_{dep}
  - I_{RAV}
  - I_{mmi}
  - I_{prec}
  - I_{RA}
  - I_{SKAR}
- Secret Key Agreement Bounds
  - Secrecy Capacity
  - Intrinsic Mutual Information
  - Reduced Intrinsic Mutual Information
  - Minimal Intrinsic Mutual Information
  - Necessary Intrinsic Mutual Information
  - Two-Part Intrinsic Mutual Information
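
Most of these are a single function call away. A brief sketch, assuming the `residual_entropy`, `perplexity`, and `gk_common_information` names used in recent dit releases:

```
>>> import dit
>>> import dit.example_dists
>>> d = dit.example_dists.Xor()
>>> dit.multivariate.residual_entropy(d)       # no variable holds private information
0.0
>>> dit.other.perplexity(d)                    # 2**H[X,Y,Z]
4.0
>>> dit.multivariate.gk_common_information(d)  # Gacs-Korner common information
0.0
```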

## Quickstart

The basic usage of dit corresponds to creating distributions, modifying them if need be, and then computing properties of those distributions. First, we import:

```
>>> import dit
```

Suppose we have a really thick coin, one so thick that there is a reasonable chance of it landing on its edge. Here is how we might represent the coin in dit.

```
>>> d = dit.Distribution(['H', 'T', 'E'], [.4, .4, .2])
>>> print(d)
Class:          Distribution
Alphabet:       ('E', 'H', 'T') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 1
RV Names:       None

x   p(x)
E   0.2
H   0.4
T   0.4
```

Calculate the probability of H and also of the combination H or T.

```
>>> d['H']
0.4
>>> d.event_probability(['H','T'])
0.8
```

Calculate the Shannon entropy and extropy of the joint distribution.

```
>>> dit.shannon.entropy(d)
1.5219280948873621
>>> dit.other.extropy(d)
1.1419011889093373
```
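
As a sanity check, the entropy value can be reproduced directly from the definition H = -sum p(x) log2 p(x) (numpy is already a dit dependency):

```
>>> import numpy as np
>>> p = np.array([.4, .4, .2])
>>> -(p * np.log2(p)).sum()   # matches dit.shannon.entropy(d)
1.5219280948873621
```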

Create a distribution where Z = xor(X, Y).

```
>>> import dit.example_dists
>>> d = dit.example_dists.Xor()
>>> d.set_rv_names(['X', 'Y', 'Z'])
>>> print(d)
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       ('X', 'Y', 'Z')

x     p(x)
000   0.25
011   0.25
101   0.25
110   0.25
```

Calculate the Shannon mutual informations I[X:Z], I[Y:Z], and I[X,Y:Z]. Note the xor structure: neither input alone carries any information about Z, yet together the inputs determine Z completely.

```
>>> dit.shannon.mutual_information(d, ['X'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['Y'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['X', 'Y'], ['Z'])
1.0
```

Calculate the marginal distribution P(X,Z). Then print its probabilities as fractions, showing the mask.

```
>>> d2 = d.marginal(['X', 'Z'])
>>> print(d2.to_string(show_mask=True, exact=True))
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 2 (mask: 3)
RV Names:       ('X', 'Z')

x     p(x)
0*0   1/4
0*1   1/4
1*0   1/4
1*1   1/4
```

Convert the distribution probabilities to log (base 3.5) probabilities, and access its probability mass function.

```
>>> d2.set_base(3.5)
>>> d2.pmf
array([-1.10658951, -1.10658951, -1.10658951, -1.10658951])
```

Draw 5 random samples from this distribution.

```
>>> dit.math.prng.seed(1)
>>> d2.rand(5)
['01', '10', '00', '01', '00']
```

## Contributions & Help

If you’d like a feature added to dit, please file an issue. Or, better yet, open a pull request. Ideally, all code should be tested and documented, but please don’t let this be a barrier to contributing. We’ll work with you to ensure that all pull requests are in a mergeable state.

If you’d like to get in contact about anything, you can reach us through our Slack channel.
