SpKit: Signal Processing toolkit

Project description

Signal Processing toolkit

Links: GitHub | PyPI project

Installation: pip install spkit

Updates: Decision Tree


Installation

Requirements: numpy, matplotlib, scipy.stats, scikit-learn

with pip

pip install spkit

Build from the source

Download the repository or clone it with git, then cd into the directory and build it from source with

python setup.py install
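
Equivalently, a source install can be done with pip itself from the repository root:

pip install .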

Functions list

Signal Processing Techniques

Information Theory functions for real valued signals

  • Entropy : Shannon entropy, Rényi entropy of order α, Collision entropy
  • Joint entropy
  • Conditional entropy
  • Mutual Information
  • Cross entropy
  • Kullback–Leibler divergence
  • Computation of the optimal histogram bin size using the Freedman–Diaconis (FD) rule (see the sketch after this list)
  • Plot histogram with optimal bin size
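
A minimal sketch of the FD-rule mentioned above, written with plain numpy rather than spkit's own helper (the function name fd_bins below is illustrative, not part of spkit's API):

import numpy as np

def fd_bins(x):
    # Freedman–Diaconis rule: bin width = 2 * IQR * n^(-1/3)
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    width = 2.0 * iqr * len(x) ** (-1.0 / 3.0)
    return int(np.ceil((x.max() - x.min()) / width))

x = np.random.randn(10000)
print(fd_bins(x))   # suggested number of histogram bins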

Matrix Decomposition

  • SVD (see the sketch after this list)
  • ICA using InfoMax, Extended-InfoMax, FastICA & Picard
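
For the SVD entry above, a generic numpy sketch (not spkit's wrapper) showing a low-rank reconstruction from the leading singular values:

import numpy as np

A = np.random.randn(8, 5)
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rank-2 approximation from the two largest singular values
A2 = U[:, :2] @ np.diag(s[:2]) @ Vt[:2, :]
print(np.linalg.norm(A - A2))   # reconstruction error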

Linear Feedback Shift Register

  • pylfsr

Continuous Wavelet Transform and other functions coming soon.

Machine Learning models - with visualizations

  • Logistic Regression
  • Naive Bayes
  • Decision Trees
  • DeepNet (to be updated)

Examples

Information Theory

View in notebook

import numpy as np
import matplotlib.pyplot as plt
import spkit as sp

x = np.random.rand(10000)
y = np.random.randn(10000)

# Shannon entropy (alpha=1)
H_x = sp.entropy(x, alpha=1)
H_y = sp.entropy(y, alpha=1)

# Rényi entropy of order alpha=2 (collision entropy)
Hr_x = sp.entropy(x, alpha=2)
Hr_y = sp.entropy(y, alpha=2)

# Joint entropy H(x,y)
H_xy = sp.entropy_joint(x, y)

# Conditional entropies H(x|y) and H(y|x)
H_x1y = sp.entropy_cond(x, y)
H_y1x = sp.entropy_cond(y, x)

# Mutual information I(x,y)
I_xy = sp.mutual_Info(x, y)

# Cross entropy
H_xy_cross = sp.entropy_cross(x, y)

# Kullback–Leibler divergence
D_xy = sp.entropy_kld(x, y)


print('Shannon entropy')
print('Entropy of x: H(x) =', H_x)
print('Entropy of y: H(y) =', H_y)
print('-')
print('Rényi entropy')
print('Entropy of x: H(x) =', Hr_x)
print('Entropy of y: H(y) =', Hr_y)
print('-')
print('Mutual information I(x,y) =', I_xy)
print('Joint entropy H(x,y) =', H_xy)
print('Conditional entropy H(x|y) =', H_x1y)
print('Conditional entropy H(y|x) =', H_y1x)
print('-')
print('Cross entropy H(x,y) =', H_xy_cross)
print('Kullback–Leibler divergence Dkl(x,y) =', D_xy)



# Histograms plotted with the optimal (FD-rule) bin size
plt.figure(figsize=(12,5))
plt.subplot(121)
sp.HistPlot(x,show=False)

plt.subplot(122)
sp.HistPlot(y,show=False)
plt.show()
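
As a sanity check on the quantities above, mutual information should satisfy I(x,y) = H(x) + H(y) - H(x,y), up to estimation error from the histogram binning; the snippet below reuses the values already computed:

# Identity I(x,y) = H(x) + H(y) - H(x,y), up to binning error
print('I(x,y)               =', I_xy)
print('H(x) + H(y) - H(x,y) =', H_x + H_y - H_xy)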

ICA

View in notebook

import numpy as np
from spkit import ICA
from spkit.data import load_data

# Sample EEG data: X has shape (n_samples, n_channels), sampled at 128 Hz
X, ch_names = load_data.eegSample()

# Take a 2-second segment (from 10 s to 12 s)
x = X[128*10:128*12, :]
t = np.arange(x.shape[0])/128.0

# ICA expects data of shape (n_channels, n_samples), hence the transpose
ica = ICA(n_components=14, method='fastica')
ica.fit(x.T)
s1 = ica.transform(x.T)

ica = ICA(n_components=14,method='infomax')
ica.fit(x.T)
s2 = ica.transform(x.T)

ica = ICA(n_components=14,method='picard')
ica.fit(x.T)
s3 = ica.transform(x.T)

ica = ICA(n_components=14,method='extended-infomax')
ica.fit(x.T)
s4 = ica.transform(x.T)
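
A short follow-up for inspecting the separated sources with plain matplotlib (not an spkit plotting helper); this assumes transform returns one component per row, i.e. shape (n_components, n_samples), so transpose s1 first if your version returns the opposite layout:

import matplotlib.pyplot as plt

plt.figure(figsize=(12, 6))
for i in range(s1.shape[0]):
    plt.plot(t, s1[i] + i * 5, lw=0.8)   # arbitrary vertical offset for visibility
plt.xlabel('time (s)')
plt.title('ICA components (FastICA)')
plt.show()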

Machine Learning

Logistic Regression - View in notebook

Naive Bayes - View in notebook

Decision Trees - View in notebook

[source code] | [jupyter-notebook]

Plotting tree while training

View in repository
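
The notebooks above demonstrate spkit's own tree class; as a stand-in that runs with only the listed requirements, the sketch below uses scikit-learn's DecisionTreeClassifier and plot_tree to show the same train-then-visualize workflow (this is not spkit's API):

import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, plot_tree

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=3).fit(X, y)

# Visualize the fitted tree
plt.figure(figsize=(10, 6))
plot_tree(clf, filled=True)
plt.show()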

LFSR

from spkit.pylfsr import LFSR

# Example 1: 5-bit LFSR with feedback polynomial x^5 + x^2 + 1
L = LFSR()
L.info()                        # print the current state and configuration
L.next()                        # run one clock cycle and return the output bit
L.runKCycle(10)                 # run 10 clock cycles and return the 10 output bits
L.runFullCycle()                # run one full period of the register
L.info()
tempseq = L.runKCycle(10000)    # generate 10000 bits from the current state
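
Since x^5 + x^2 + 1 is a primitive polynomial, the full cycle of this 5-bit register has period 2^5 - 1 = 31; a quick check, assuming runFullCycle returns the generated bit sequence:

seq = L.runFullCycle()
print(len(seq))   # expected 31 = 2**5 - 1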

Contacts:

PhD Student: Queen Mary University of London & University of Genoa




Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

spkit-0.0.4.tar.gz (32.3 kB)

Uploaded Source

Built Distribution

spkit-0.0.4-py3-none-any.whl (31.2 kB)

Uploaded Python 3

File details

Details for the file spkit-0.0.4.tar.gz.

File metadata

  • Download URL: spkit-0.0.4.tar.gz
  • Upload date:
  • Size: 32.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/42.0.2 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for spkit-0.0.4.tar.gz

  • SHA256: ac64999a30d4f3781d557753f6fa1f328195994cc222a8e5cebb73fc087ae967
  • MD5: a0b1ec0537363cb54c82cf8b8de3eaed
  • BLAKE2b-256: 1b750dd6537be2e8a6de0687c7b218aed0c715c0c85e35bea5e374c56ae55249


File details

Details for the file spkit-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: spkit-0.0.4-py3-none-any.whl
  • Upload date:
  • Size: 31.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/42.0.2 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for spkit-0.0.4-py3-none-any.whl

  • SHA256: 57ebb18f506add4b9a815f339b51a78081c015e4ae1c35fbf5ba9b418c27aee1
  • MD5: e00bff5528d4a021818c2deec9e2e082
  • BLAKE2b-256: dcdc077508c9e7bf70ba82ad03d6c6ca8fe707f977e2275ebdadebc007641c14

