
Entropy and Kullback-Leibler divergence on GPU via PyOpenCL


futhark-kullback-liebler

The Kullback-Leibler divergence in Futhark
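For reference, the two quantities benchmarked below can be sketched in NumPy for discrete distributions p and q over the same support. The function names here are illustrative only, not this package's API:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p * log p), in nats."""
    p = np.asarray(p, dtype=np.float64)
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum(p * log(p / q))."""
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    return np.sum(p * np.log(p / q))

# A uniform distribution over 4 outcomes has entropy log(4).
p = np.full(4, 0.25)
print(np.isclose(shannon_entropy(p), np.log(4)))  # True

# KL divergence of a distribution against itself is 0.
print(np.isclose(kl_divergence(p, p), 0.0))       # True
```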

Benchmarks

To run the benchmarks:

make
pipenv run python harness.py

and

futhark bench information.fut --backend opencl --runs=100
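The pure-SciPy rows in the table below can be reproduced with `scipy.stats.entropy`, which returns the Shannon entropy when given one distribution and the Kullback-Leibler divergence when given two. A minimal timing sketch (the actual contents of harness.py may differ):

```python
import time
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(0)
n = 10_000_000  # array size used in the benchmarks below

# Random normalized distributions of the benchmarked size.
p = rng.random(n); p /= p.sum()
q = rng.random(n); q /= q.sum()

start = time.perf_counter()
h = entropy(p)      # Shannon entropy of p
print(f"entropy:        {time.perf_counter() - start:.3f} s")

start = time.perf_counter()
kl = entropy(p, q)  # Kullback-Leibler divergence D(p || q)
print(f"K-L divergence: {time.perf_counter() - start:.3f} s")
```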

Comparison

Computation                  Array size   Implementation     Time
Entropy                      10,000,000   Futhark            27.41 ms
Kullback-Leibler divergence  10,000,000   Futhark            19.61 ms
Entropy                      10,000,000   Python + Futhark   52.80 ms
Kullback-Leibler divergence  10,000,000   Python + Futhark   94.07 ms
Entropy                      10,000,000   Python (SciPy)     233.45 ms
Kullback-Leibler divergence  10,000,000   Python (SciPy)     340.83 ms
Entropy                      10,000,000   J                  227.37 ms

Download files


Files for entropy-gpu, version 0.2.0
Filename                            Size     File type   Python version
entropy_gpu-0.2.0-py3-none-any.whl  35.5 kB  Wheel       py3
entropy-gpu-0.2.0.tar.gz            33.2 kB  Source      None
