
# futhark-kullback-liebler

Entropy, the Kullback-Leibler divergence, the Hellinger distance, and the alpha-divergence in Futhark, callable from Python on the GPU via PyOpenCL.
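
For reference, these quantities are defined as below for discrete distributions *p* and *q*. Note that the alpha-divergence has several conventions in the literature; the one shown is a common choice and is not necessarily the exact convention this library uses.

```latex
% Entropy of a discrete distribution p
H(p) = -\sum_i p_i \log p_i

% Kullback-Leibler divergence from q to p
D_{\mathrm{KL}}(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}

% Squared Hellinger distance
H^2(p, q) = \frac{1}{2} \sum_i \left( \sqrt{p_i} - \sqrt{q_i} \right)^2

% One common convention for the alpha-divergence (alpha not 0 or 1)
D_\alpha(p \,\|\, q) = \frac{1}{\alpha(\alpha - 1)}
  \left( \sum_i p_i^\alpha \, q_i^{1 - \alpha} - 1 \right)
```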

## Documentation

Documentation is available [here](https://vmchale.github.io/kullback-liebler/).

## Benchmarks

To run the benchmarks:

```
make
pipenv run python harness.py
```

and

```
futhark bench information.fut --backend opencl --runs=100
```
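
As a rough sketch of what the Python side of the harness might look like: Futhark's PyOpenCL backend (`futhark pyopencl --library information.fut`) generates a Python module whose class wraps each entry point. The module name `information` matches the benchmark file above, but the entry-point names (`entropy`, `kullback_leibler`) are assumptions for illustration, not a documented API.

```python
# Hypothetical sketch of calling the Futhark-generated PyOpenCL module.
# Assumes `futhark pyopencl --library information.fut` has produced
# information.py; the entry-point names below are guesses, not confirmed API.
import numpy as np
import information

fut = information.information()  # sets up the OpenCL context and queue

xs = np.random.rand(10_000_000).astype(np.float32)
xs /= xs.sum()                   # normalise to a probability distribution

print(fut.entropy(xs))           # array is copied to the GPU and reduced there
```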

### Comparison

| Computation | Array Size | Implementation | Time |
| ----------- | ---------- | -------------- | ---- |
| Entropy | 10,000,000 | Futhark | 27.41 ms |
| Kullback-Leibler divergence | 10,000,000 | Futhark | 19.61 ms |
| Entropy | 10,000,000 | Python + Futhark | 52.80 ms |
| Kullback-Leibler divergence | 10,000,000 | Python + Futhark | 94.07 ms |
| Entropy | 10,000,000 | Python (SciPy) | 233.45 ms |
| Kullback-Leibler divergence | 10,000,000 | Python (SciPy) | 340.83 ms |
| Entropy | 10,000,000 | J | 227.37 ms |

