Tools for fast and robust univariate and multivariate kernel density estimation

## Software Overview

fastKDE calculates a kernel density estimate of arbitrarily dimensioned data; it does so rapidly and robustly using recently developed KDE techniques. Its statistical skill is on par with state-of-the-art R KDE packages, and it is roughly 10,000 times faster for bivariate data, with even larger speedups at higher dimensionality.

Please cite the following papers when using this method:

O’Brien, T. A., Kashinath, K., Cavanaugh, N. R., Collins, W. D. & O’Brien, J. P. A fast and objective multidimensional kernel density estimation method: fastKDE. Comput. Stat. Data Anal. 101, 148–160 (2016).

O’Brien, T. A., Collins, W. D., Rauscher, S. A. & Ringler, T. D. Reducing the computational cost of the ECF using a nuFFT: A fast and objective probability density estimation method. Comput. Stat. Data Anal. 79, 222–234 (2014).

Example usage:

### For a standard PDF

```python
import numpy as np
from fastkde import fastKDE
import pylab as PP

#Generate two random variables (200,000 pairs of datapoints)
N = int(2e5)
var1 = 50*np.random.normal(size=N) + 0.1
var2 = 0.01*np.random.normal(size=N) - 300

#Do the self-consistent density estimate
myPDF,axes = fastKDE.pdf(var1,var2)

#Extract the axes from the axis list
v1,v2 = axes

#Plot contours of the PDF; these should be concentric ellipsoids centered
#on (0.1, -300). Comparatively, the y-axis range should be tiny and the
#x-axis range should be large
PP.contour(v1,v2,myPDF)
PP.show()
```
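Because the estimate is returned on the grid defined by `axes`, it can be sanity-checked by numerical integration: a proper PDF should integrate to approximately 1. Below is a minimal sketch of that check using a closed-form Gaussian as a stand-in for the fastKDE output; the same `np.trapz` pattern would apply to `myPDF`, `v1`, and `v2` above, assuming the grid is indexed as (y, x) the way `PP.contour(v1, v2, myPDF)` expects.

```python
import numpy as np

#Build an analytic standard-normal bivariate PDF on a grid as a
#stand-in for fastKDE output (rows index v2, columns index v1)
v1 = np.linspace(-6, 6, 257)
v2 = np.linspace(-6, 6, 257)
V1, V2 = np.meshgrid(v1, v2)
pdf = np.exp(-(V1**2 + V2**2) / 2) / (2 * np.pi)

#Integrate over both axes with the trapezoid rule:
#over columns (v1, axis=1) first, then over rows (v2)
total = np.trapz(np.trapz(pdf, v1, axis=1), v2)
print(total)  # should be close to 1
```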

### For a conditional PDF

The following code generates samples from a non-trivial joint distribution:

```python
from fastkde import fastKDE
import pylab as PP
import numpy as np

#***************************
# Generate random samples
#***************************
# Stochastically sample from the function underlyingFunction() (a sigmoid):
# sample the abscissa values from a gamma distribution,
# relate the ordinate values to the sampled abscissa values, and add
# noise from a normal distribution

#Set the number of samples
numSamples = int(1e6)

#Define a sigmoid function
def underlyingFunction(x,x0=305,y0=200,yrange=4):
return (yrange/2)*np.tanh(x-x0) + y0

xp1,xp2,xmid = 5,2,305  #Set gamma distribution parameters
yp1,yp2 = 0,12          #Set normal distribution parameters (mean and std)

#Generate random samples of X from the gamma distribution
x = -(np.random.gamma(xp1,xp2,int(numSamples))-xp1*xp2) + xmid
#Generate random samples of y from x and add normally distributed noise
y = underlyingFunction(x) + np.random.normal(loc=yp1,scale=yp2,size=numSamples)
```

Now that we have the x,y samples, the following code calculates the conditional PDF P(y|x):

```python
#***************************
# Calculate the conditional
#***************************
pOfYGivenX,axes = fastKDE.conditional(y,x)
```
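A gridded conditional like `pOfYGivenX` can be post-processed directly; for example, the conditional mean E[y|x] is a weighted integral of y along the y axis for each x column. The sketch below uses a synthetic conditional (Gaussian in y with a sigmoid-shaped mean in x, mimicking the example above) as a stand-in for the fastKDE output, assuming the same (y, x) grid layout used by the plotting code; the grid sizes and `sigma` are illustrative choices, not fastKDE defaults.

```python
import numpy as np

#Synthetic stand-in for pOfYGivenX: p(y|x) on a (len(y_axis), len(x_axis))
#grid, Gaussian in y with an x-dependent mean (a sigmoid, as in the example)
x_axis = np.linspace(295, 315, 201)
y_axis = np.linspace(180, 220, 401)
mean_y = 2 * np.tanh(x_axis - 305) + 200  #hypothetical underlying relation
sigma = 3.0
Y = y_axis[:, None]  #shape (ny, 1), broadcasts against x_axis
p_y_given_x = np.exp(-(Y - mean_y)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

#Conditional mean E[y|x]: integrate y * p(y|x) over y for each x column
cond_mean = np.trapz(Y * p_y_given_x, y_axis, axis=0)

print(cond_mean[100])  # at x = 305, should be close to 200
```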

The following plot shows the results:

```python
#***************************
# Plot the conditional
#***************************
fig,axs = PP.subplots(1,2,figsize=(10,5))

#Plot a scatter plot of the incoming data
axs[0].plot(x,y,'k.',alpha=0.1)
axs[0].set_title('Original (x,y) data')

#Set axis labels
for i in (0,1):
    axs[i].set_xlabel('x')
    axs[i].set_ylabel('y')

#Draw a contour plot of the conditional
axs[1].contourf(axes[0],axes[1],pOfYGivenX,64)
#Overplot the original underlying relationship
axs[1].plot(axes[0],underlyingFunction(axes[0]),linewidth=3,linestyle='--',alpha=0.5)
axs[1].set_title('P(y|x)')

#Set axis limits to be the same
xlim = [np.amin(axes[0]),np.amax(axes[0])]
ylim = [np.amin(axes[1]),np.amax(axes[1])]
axs[0].set_xlim(xlim)
axs[0].set_ylim(ylim)
axs[1].set_xlim(xlim)
axs[1].set_ylim(ylim)

fig.tight_layout()

PP.savefig('conditional_demo.png')
PP.show()
```

*Conditional PDF (`conditional_demo.png`)*

## How do I get set up?

A standard python build: `python setup.py install`

or

`pip install fastkde`

### Install pre-requisites

This code requires the following software:

* Python >= 2.7.3
* Numpy >= 1.7
* scipy
* cython

## Project details

Current version: 1.0.13 (earlier releases: 1.0.7 through 1.0.12)