# dictances

## Project description

Distances and divergences between distributions implemented in python.

## How do I install this package?

```shell
pip install dictances
```

## Test Coverage

Since different coverage tools sometimes report slightly different results, coverage is tracked by three services.

## Available metrics

A number of distances and divergences are available:

| Distances | Methods |
|-----------|---------|
| Bhattacharyya distance | `bhattacharyya` |
| Bhattacharyya coefficient | `bhattacharyya_coefficient` |
| Canberra distance | `canberra` |
| Chebyshev distance | `chebyshev` |
| Chi-square distance | `chi_square` |
| Cosine distance | `cosine` |
| Euclidean distance | `euclidean` |
| Hamming distance | `hamming` |
| Jensen-Shannon divergence | `jensen_shannon` |
| Kullback-Leibler divergence | `kullback_leibler` |
| Mean absolute error | `mae` |
| Taxicab geometry | `manhattan`, `cityblock`, `total_variation` |
| Minkowski distance | `minkowsky` |
| Mean squared error | `mse` |
| Pearson's distance | `pearson` |
| Squared deviations from the mean | `squared_variation` |
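To clarify how dictionaries are treated as inputs, here is a minimal standalone sketch of the Euclidean distance between two dictionaries, viewing each as a sparse vector. This is an illustration of the standard formula, not necessarily the library's exact handling of missing keys; `euclidean_distance` here is a hypothetical helper, not the library function.

```python
import math

def euclidean_distance(p, q):
    """Euclidean distance between two dictionaries treated as sparse
    vectors; keys missing from one dictionary count as 0 (a sketch,
    not necessarily dictances' exact handling of missing keys)."""
    keys = set(p) | set(q)
    return math.sqrt(sum((p.get(k, 0) - q.get(k, 0)) ** 2 for k in keys))

print(euclidean_distance({"a": 3, "b": 4}, {"a": 0}))  # sqrt(3**2 + 4**2) = 5.0
```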

## Usage example

```python
from dictances import cosine

cosine(my_first_dictionary, my_second_dictionary)
```
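For reference, a self-contained sketch of what a cosine distance between two dictionaries computes, using the standard formula `1 - (p·q)/(|p||q|)`. The `cosine_distance` helper below is illustrative and written from the textbook definition, not copied from the library's implementation:

```python
import math

def cosine_distance(p, q):
    """Cosine distance between two dictionaries, treating each as a
    sparse vector keyed by dictionary keys (a sketch of the standard
    formula, not necessarily dictances' exact implementation)."""
    keys = set(p) | set(q)
    dot = sum(p.get(k, 0) * q.get(k, 0) for k in keys)
    norm_p = math.sqrt(sum(v * v for v in p.values()))
    norm_q = math.sqrt(sum(v * v for v in q.values()))
    return 1.0 - dot / (norm_p * norm_q)

print(cosine_distance({"a": 1, "b": 2}, {"a": 2, "b": 1}))  # ~0.2
```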

## Handling nested dictionaries

If you need to compute the distance between two nested dictionaries, you can use `deflate` from the `deflate_dict` package as follows:

```python
from dictances import cosine
from deflate_dict import deflate

my_first_dictionary = {
    "a": 8,
    "b": {
        "c": 3,
        "d": 6,
    },
}

my_second_dictionary = {
    "b": {
        "c": 8,
        "d": 1,
    },
    "y": 3,
}

cosine(deflate(my_first_dictionary), deflate(my_second_dictionary))
```
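To make the idea of flattening concrete, here is a minimal flattening helper. It is a hypothetical stand-in for `deflate`, and the real function's key-joining scheme may differ:

```python
def flatten(d, parent_key="", sep="_"):
    """Flatten a nested dictionary into a single level by joining
    nested keys with `sep` (illustrative; deflate's actual key-joining
    scheme may differ)."""
    out = {}
    for k, v in d.items():
        key = f"{parent_key}{sep}{k}" if parent_key else k
        if isinstance(v, dict):
            out.update(flatten(v, key, sep))
        else:
            out[key] = v
    return out

print(flatten({"a": 8, "b": {"c": 3, "d": 6}}))
# {'a': 8, 'b_c': 3, 'b_d': 6}
```

Once both dictionaries are flat, any of the metrics above can be applied to them directly.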
