Distances and divergences between distributions implemented in python.


## How do I install this package?

```
pip install dictances
```

## Available metrics

The following distances and divergences are available:

| Distance | Method |
| --- | --- |
| Bhattacharyya distance | `bhattacharyya` |
| Bhattacharyya coefficient | `bhattacharyya_coefficient` |
| Canberra distance | `canberra` |
| Chebyshev distance | `chebyshev` |
| Chi-square distance | `chi_square` |
| Cosine distance | `cosine` |
| Euclidean distance | `euclidean` |
| Hamming distance | `hamming` |
| Jensen-Shannon divergence | `jensen_shannon` |
| Kullback-Leibler divergence | `kullback_leibler` |
| Mean absolute error | `mae` |
| Taxicab geometry | `manhattan`, `cityblock`, `total_variation` |
| Minkowski distance | `minkowsky` |
| Mean squared error | `mse` |
| Pearson's distance | `pearson` |
| Squared deviations from the mean | `squared_variation` |
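All of these metrics operate on dictionaries, treating keys as dimensions and values as coordinates. For intuition, here is a hand-rolled sketch of how a dictionary-based metric such as the taxicab (Manhattan) distance can be computed over the union of keys, with missing keys treated as zero. This is an illustration only, not the library's implementation; `manhattan_sketch` is a hypothetical name.

```python
def manhattan_sketch(a: dict, b: dict) -> float:
    """Illustrative sketch: sum of absolute differences over the
    union of keys; a key absent from one dictionary counts as 0.
    Not the dictances implementation."""
    keys = set(a) | set(b)
    return float(sum(abs(a.get(k, 0) - b.get(k, 0)) for k in keys))

print(manhattan_sketch({"a": 1, "b": 3}, {"b": 1, "c": 2}))  # -> 5.0
```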

## Usage example

```
from dictances import cosine

cosine(my_first_dictionary, my_second_dictionary)
```
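For reference, cosine distance between two dictionaries can be understood as one minus the cosine similarity of their value vectors over the union of keys. The sketch below is a self-contained, hand-written version for intuition (the function name `cosine_distance_sketch` is hypothetical, and dictances' exact conventions, e.g. handling of zero vectors, may differ):

```python
import math

def cosine_distance_sketch(a: dict, b: dict) -> float:
    """Illustrative sketch: 1 - cosine similarity over the union of
    keys, with missing keys treated as 0. Assumes neither dictionary
    is all zeros. Not the dictances implementation."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in keys)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return 1.0 - dot / (norm_a * norm_b)

print(cosine_distance_sketch({"x": 1, "y": 2}, {"x": 1, "y": 2}))  # ~ 0.0
print(cosine_distance_sketch({"x": 1}, {"y": 1}))  # orthogonal -> 1.0
```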

## Handling nested dictionaries

If you need to compute the distance between two nested dictionaries, you can flatten them first with `deflate` from the `deflate_dict` package, as follows:

```
from dictances import cosine
from deflate_dict import deflate

my_first_dictionary = {
    "a": 8,
    "b": {
        "c": 3,
        "d": 6
    }
}

my_second_dictionary = {
    "b": {
        "c": 8,
        "d": 1
    },
    "y": 3
}

cosine(deflate(my_first_dictionary), deflate(my_second_dictionary))
```
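The idea behind flattening is to turn nested keys into single composite keys so that the flat-dictionary metrics apply. The sketch below shows one plausible way to do this, joining key segments with an underscore; note that `deflate_sketch` is a hypothetical stand-in, and the real `deflate` may use a different key-joining convention and signature:

```python
def deflate_sketch(d: dict, sep: str = "_", prefix: str = "") -> dict:
    """Illustrative sketch: recursively flatten a nested dictionary,
    joining key segments with `sep`. Not the deflate_dict implementation;
    the real library's key format may differ."""
    out = {}
    for k, v in d.items():
        key = f"{prefix}{sep}{k}" if prefix else str(k)
        if isinstance(v, dict):
            out.update(deflate_sketch(v, sep=sep, prefix=key))
        else:
            out[key] = v
    return out

print(deflate_sketch({"a": 8, "b": {"c": 3, "d": 6}}))
# -> {'a': 8, 'b_c': 3, 'b_d': 6}
```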
