SpKit: Signal Processing toolkit | Nikesh Bajaj

# Signal Processing toolkit

### Links: Homepage | Documentation | Github | PyPI project

Installation: `pip install spkit`

## New Updates

**Version: 0.0.9.2**

• Added Scalogram with CWT functions

**Version: 0.0.9.1**

• Decision Tree view notebooks

**Version: 0.0.7**

• Analyse the performance of a trained tree at different depths - with ONE-TIME training only
• Optimize the depth of the tree
• Shrink the trained tree to the optimal depth
• Plot the learning curve
• Classification: compute the probability and counts of labels at a leaf for a given example sample
• Regression: compute the standard deviation and number of training samples at a leaf for a given example sample

**Version: 0.0.6**: Works with categorical features without converting them into binary vectors

**Version: 0.0.5**: Toy examples to understand the effect of increasing max_depth of a Decision Tree

## Installation

Requirements: numpy, matplotlib, scipy (scipy.stats), scikit-learn

### with pip

```
pip install spkit
```

### update with pip

```
pip install spkit --upgrade
```

### Build from the source

Download the repository or clone it with git, `cd` into the directory, and build it from source with

```
python setup.py install
```

## Functions list

#### Signal Processing Techniques

Information Theory functions for real-valued signals

• Entropy: Shannon entropy, Rényi entropy of order α, Collision entropy
• Joint entropy
• Conditional entropy
• Mutual Information
• Cross entropy
• Kullback–Leibler divergence
• Computation of optimal bin size for histogram using FD-rule
• Plot histogram with optimal bin size
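The FD-rule above is the Freedman–Diaconis rule. spkit's helper for it is not shown on this page, so here is a minimal sketch of the underlying computation in plain NumPy (the function name `fd_bin_count` is ours, not spkit's):

```python
import numpy as np

def fd_bin_count(x):
    """Number of histogram bins by the Freedman-Diaconis rule:
    bin width h = 2*IQR(x) / n**(1/3)."""
    x = np.asarray(x)
    iqr = np.subtract(*np.percentile(x, [75, 25]))   # interquartile range
    h = 2 * iqr / len(x) ** (1 / 3)                  # FD bin width
    return int(np.ceil((x.max() - x.min()) / h))

rng = np.random.default_rng(0)
x = rng.standard_normal(10000)
print(fd_bin_count(x))
```

The rule is robust to outliers because it uses the IQR rather than the standard deviation to set the bin width.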

Matrix Decomposition

• SVD
• ICA using InfoMax, Extended-InfoMax, FastICA & Picard
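spkit's SVD wrapper signature is not shown here; as a plain NumPy sketch of what a truncated SVD gives you on a channels-by-samples matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 100))            # 8 "channels", 100 samples

# Full decomposition X = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Keep the k strongest components: the best rank-k approximation of X
k = 3
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

err = np.linalg.norm(X - X_k) / np.linalg.norm(X)
print(f"relative error of rank-{k} approximation: {err:.3f}")
```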

Linear Feedback Shift Register

• pylfsr

Continuous Wavelet Transform and other functions coming soon.

#### Machine Learning models - with visualizations

• Logistic Regression
• Naive Bayes
• Decision Trees
• DeepNet (to be updated)
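The spkit model APIs are shown in the linked notebooks. Purely to illustrate the model these classes implement (a from-scratch sketch, not spkit's API), here is a minimal gradient-descent logistic regression:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=500):
    """Plain batch gradient descent on the logistic loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

# Two linearly separable Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.r_[np.zeros(100), np.ones(100)]

w, b = fit_logistic(X, y)
acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```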

# Examples

## Scalogram CWT

```
import numpy as np
import matplotlib.pyplot as plt
import spkit as sp
from spkit.cwt import ScalogramCWT
from spkit.cwt import compare_cwt_example

x,fs = sp.load_data.eegSample_1ch()
t = np.arange(len(x))/fs
print(x.shape, t.shape)
compare_cwt_example(x,t,fs=fs)
```

```
f0 = np.linspace(0.1,10,100)
Q  = np.linspace(0.1,5,100)
XW,S = ScalogramCWT(x,t,fs=fs,wType='Gauss',PlotPSD=True,f0=f0,Q=Q)
```

## Information Theory

### View in notebook

```
import numpy as np
import matplotlib.pyplot as plt
import spkit as sp

x = np.random.rand(10000)
y = np.random.randn(10000)

# Shannon entropy
H_x = sp.entropy(x,alpha=1)
H_y = sp.entropy(y,alpha=1)

# Rényi entropy
Hr_x = sp.entropy(x,alpha=2)
Hr_y = sp.entropy(y,alpha=2)

# Joint entropy
H_xy = sp.entropy_joint(x,y)

# Conditional entropies
H_x1y = sp.entropy_cond(x,y)
H_y1x = sp.entropy_cond(y,x)

# Mutual information
I_xy = sp.mutual_Info(x,y)

# Cross entropy
H_xy_cross = sp.entropy_cross(x,y)

# Kullback–Leibler divergence
D_xy = sp.entropy_kld(x,y)

print('Shannon entropy')
print('Entropy of x: H(x) = ',H_x)
print('Entropy of y: H(y) = ',H_y)
print('-')
print('Rényi entropy')
print('Entropy of x: H(x) = ',Hr_x)
print('Entropy of y: H(y) = ',Hr_y)
print('-')
print('Mutual Information I(x,y) = ',I_xy)
print('Joint Entropy H(x,y) = ',H_xy)
print('Conditional Entropy H(x|y) = ',H_x1y)
print('Conditional Entropy H(y|x) = ',H_y1x)
print('-')
print('Cross Entropy H(x,y) = ',H_xy_cross)
print('Kullback–Leibler divergence Dkl(x,y) = ',D_xy)

plt.figure(figsize=(12,5))
plt.subplot(121)
sp.HistPlot(x,show=False)

plt.subplot(122)
sp.HistPlot(y,show=False)
plt.show()
```
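Under the hood these are histogram-based (plug-in) estimates; a minimal sketch of the discretized Shannon entropy (spkit's exact binning may differ):

```python
import numpy as np

def entropy_plugin(x, bins=50):
    """Plug-in Shannon entropy (in bits) from a normalized histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # convention: 0*log(0) = 0
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
u = rng.random(100000)                # uniform on [0, 1]
print(entropy_plugin(u, bins=64))     # close to the maximum log2(64) = 6 bits
```

A uniform source maximizes the entropy at log2(bins); any non-uniform distribution over the same bins scores strictly lower.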

## Independent Component Analysis

### View in notebook

```
import numpy as np
from spkit import ICA
from spkit.data import load_data
X,ch_names = load_data.eegSample()

x = X[128*10:128*12,:]
t = np.arange(x.shape[0])/128.0

ica = ICA(n_components=14,method='fastica')
ica.fit(x.T)
s1 = ica.transform(x.T)

ica = ICA(n_components=14,method='infomax')
ica.fit(x.T)
s2 = ica.transform(x.T)

ica = ICA(n_components=14,method='picard')
ica.fit(x.T)
s3 = ica.transform(x.T)

ica = ICA(n_components=14,method='extended-infomax')
ica.fit(x.T)
s4 = ica.transform(x.T)
```

## Machine Learning

### Logistic Regression - View in notebook

### Naive Bayes - View in notebook

### Decision Trees - View in notebook

#### Plotting tree while training - view in repository
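spkit's tree can score one trained tree at every depth without retraining (see the 0.0.7 notes above). The depth-selection idea itself can be sketched with scikit-learn, which refits a tree per candidate depth:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Cross-validate a tree at each candidate depth and keep the best one
depths = range(1, 11)
scores = [cross_val_score(DecisionTreeClassifier(max_depth=d, random_state=0),
                          X, y, cv=5).mean() for d in depths]
best = depths[int(np.argmax(scores))]
print(f"best max_depth: {best} (CV accuracy {max(scores):.3f})")
```

Shrinking the trained tree to `best` then gives a smaller model with no loss in cross-validated accuracy.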

## Linear Feedback Shift Register

```
from spkit.pylfsr import LFSR

# Example: 5-bit LFSR with feedback polynomial x^5 + x^2 + 1
L = LFSR()
L.info()
L.next()
L.runKCycle(10)
L.runFullCycle()
L.info()
tempseq = L.runKCycle(10000)    # generate 10000 bits from current state
```
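pylfsr tracks the sequence and period internally. As an independent sanity check that a degree-5 primitive feedback polynomial gives the maximal period 2^5 - 1 = 31, here is a minimal pure-Python Fibonacci LFSR (the tap convention is our own and may differ from pylfsr's):

```python
def lfsr_period(taps, nbits):
    """Count steps until a Fibonacci LFSR returns to the all-ones seed."""
    state = (1 << nbits) - 1                      # seed: all ones
    seed, period = state, 0
    while True:
        fb = 0
        for t in taps:                            # XOR of the tapped stages
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | fb) & ((1 << nbits) - 1)
        period += 1
        if state == seed:
            return period

# Taps [5, 2] here realize the recurrence b_n = b_{n-5} XOR b_{n-2},
# a degree-5 primitive feedback, so the period is maximal: 2^5 - 1 = 31
print(lfsr_period(taps=[5, 2], nbits=5))
```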

# Contacts:
