Entropy and K-L divergence on GPU via PyOpenCL
# futhark-kullback-liebler
The Kullback-Leibler divergence, Hellinger distance, and alpha-divergence, implemented in Futhark.
## Documentation
Documentation is available [here](https://vmchale.github.io/kullback-liebler/).
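For reference, the quantities the library computes can be sketched in plain NumPy. This is a sketch of the underlying math only; the helper names below are illustrative and are not the package's actual entry points:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats."""
    p = np.asarray(p, dtype=np.float64)
    return -np.sum(p * np.log(p))

def kullback_leibler(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i)."""
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    return np.sum(p * np.log(p / q))

def hellinger(p, q):
    """Hellinger distance sqrt(1 - BC(p, q)), where BC = sum_i sqrt(p_i * q_i)."""
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    return np.sqrt(1.0 - np.sum(np.sqrt(p * q)))
```

The Futhark implementation fuses the `map` and `reduce` steps of these sums into a single GPU kernel, which is where the speedups in the benchmarks below come from.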
## Benchmarks
To run the benchmarks:

```
make
pipenv run python harness.py
```

and

```
futhark bench information.fut --backend opencl --runs=100
```
### Comparison
| Computation | Array Size | Implementation | Time |
| ----------- | ---------- | -------------- | ---- |
| Entropy | 10000000 | Futhark | 27.41 ms |
| Kullback-Leibler Divergence | 10000000 | Futhark | 19.61 ms |
| Entropy | 10000000 | Python + Futhark | 52.80 ms |
| Kullback-Leibler Divergence | 10000000 | Python + Futhark | 94.07 ms |
| Entropy | 10000000 | Python (SciPy) | 233.45 ms |
| Kullback-Leibler Divergence | 10000000 | Python (SciPy) | 340.83 ms |
| Entropy | 10000000 | J | 227.37 ms |
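The pure-Python baseline rows can be reproduced approximately along these lines. This is a hedged sketch, not the actual `harness.py`: the random distribution and the timing loop are assumptions, and `scipy.stats.entropy(p)` computes the same value as the NumPy expression used here:

```python
import time
import numpy as np

# Build a normalized distribution of the benchmarked size (10,000,000 elements).
rng = np.random.default_rng(0)
x = rng.random(10_000_000)
p = x / x.sum()

# CPU baseline: H(p) = -sum(p * log p), the quantity scipy.stats.entropy returns.
start = time.perf_counter()
h = -np.sum(p * np.log(p))
elapsed_ms = (time.perf_counter() - start) * 1000.0
print(f"entropy = {h:.4f} nats in {elapsed_ms:.2f} ms")
```

Absolute times will of course vary with hardware; the table above compares implementations on the same machine.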