Signal Processing Algorithms

A suite of algorithms implementing E-Divisive with Means and the Generalized ESD Test for Outliers in Python.

Getting Started - Users

pip install signal-processing-algorithms

Getting Started - Developers

Getting the code:

$ git clone git@github.com:mongodb/signal-processing-algorithms.git
$ cd signal-processing-algorithms

Installation

$ pip install poetry
$ poetry install

Testing/linting:

$ poetry run pytest

Running the slow tests:

$ poetry run pytest --runslow

Some of the larger tests can take a significant amount of time (more than 2 hours).

Intro to E-Divisive

Detecting distributional changes in a series of numerical values can be surprisingly difficult. Simple systems based on thresholds or mean values can yield false positives due to outliers in the data, and they will fail to detect changes in the noise profile of the series you are analyzing.

One robust way of detecting many of the changes missed by other approaches is E-Divisive with Means, an energy-statistics-based approach that compares the expected distance (Euclidean norm) between samples from two portions of the series with the expected distance between samples within those portions.

That is to say, assuming that the two portions can each be modeled as i.i.d. samples drawn from distinct random variables (X for the first portion, Y for the second portion), you would expect the following energy statistic to be non-zero if there is a difference between the two portions:

\[
\mathcal{E}(X, Y; \alpha) = 2\,\mathbb{E}|X - Y|^{\alpha} - \mathbb{E}|X - X'|^{\alpha} - \mathbb{E}|Y - Y'|^{\alpha}
\]

where X' and Y' are independent copies of X and Y, and alpha is some fixed constant in (0, 2). This can be calculated empirically with samples x_1, ..., x_m and y_1, ..., y_n from the portions corresponding to X, Y as follows:

\[
\hat{\mathcal{E}}(X, Y; \alpha) = \frac{2}{mn}\sum_{i=1}^{m}\sum_{j=1}^{n}|x_i - y_j|^{\alpha} - \binom{m}{2}^{-1}\sum_{1 \le i < k \le m}|x_i - x_k|^{\alpha} - \binom{n}{2}^{-1}\sum_{1 \le j < k \le n}|y_j - y_k|^{\alpha}
\]

Thus for a series Z of length L, writing m = τ and n = L - τ, we find the most likely change point by solving the following for argmax(τ) (with a scaling factor of mn/(m+n) and α = 1 for simplicity):

\[
\hat{q}(\tau) = \frac{mn}{m+n}\left(\frac{2}{mn}\sum_{i=1}^{\tau}\sum_{j=\tau+1}^{L}|z_i - z_j| - \binom{m}{2}^{-1}\sum_{1 \le i < k \le \tau}|z_i - z_k| - \binom{n}{2}^{-1}\sum_{\tau < j < k \le L}|z_j - z_k|\right)
\]
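
As a rough illustration, here is a minimal NumPy sketch of this statistic. It computes the metric naively in O(L^2) per candidate τ, whereas the package ships optimized calculators; the names qhat and most_likely_change_point are ours for illustration, not the library's API.

import numpy as np

def qhat(series: np.ndarray, tau: int) -> float:
    """Divergence metric q-hat at candidate change point tau (alpha = 1)."""
    x, y = series[:tau], series[tau:]
    m, n = len(x), len(y)
    if m < 2 or n < 2:
        return 0.0
    # 2/(mn) times the sum of |x_i - y_j| over all cross pairs
    cross = 2.0 * np.abs(x[:, None] - y[None, :]).sum() / (m * n)
    # The full |x_i - x_k| matrix counts each pair twice, so dividing by
    # m(m-1) equals averaging over the m-choose-2 distinct pairs.
    within_x = np.abs(x[:, None] - x[None, :]).sum() / (m * (m - 1))
    within_y = np.abs(y[:, None] - y[None, :]).sum() / (n * (n - 1))
    return (m * n / (m + n)) * (cross - within_x - within_y)

def most_likely_change_point(series: np.ndarray) -> int:
    """Return the tau maximizing q-hat over a single series."""
    return max(range(2, len(series) - 2), key=lambda tau: qhat(series, tau))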

Multiple Change Points

The algorithm for finding multiple change points is also simple (a code sketch follows the list below).

Assuming you have some k known change points:

  1. Partition the series into segments between/around these change points.
  2. Find the maximum value of our divergence metric within each partition.
  3. Take the maximum of the maxima we have just found; this is our (k+1)th change point.
  4. Return to step 1 and continue until reaching your stopping criterion.
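
Here is an illustrative sketch of that loop, reusing the qhat and most_likely_change_point helpers from the sketch above. The stop callback stands in for the stopping criterion described in the next section, and none of these names are the package's own API.

def find_change_points(series, stop) -> list[int]:
    """Add change points one at a time until `stop` rejects the best candidate."""
    change_points: list[int] = []
    while True:
        # 1. Partition the series around the known change points.
        bounds = sorted([0, *change_points, len(series)])
        # 2. Find the best candidate within each partition.
        candidates = []
        for lo, hi in zip(bounds, bounds[1:]):
            if hi - lo >= 5:  # need enough points for a candidate
                tau = most_likely_change_point(series[lo:hi])
                candidates.append((qhat(series[lo:hi], tau), lo + tau))
        if not candidates:
            return sorted(change_points)
        # 3. The best of the per-partition maxima is the (k+1)th candidate.
        best_metric, best_point = max(candidates)
        # 4. Stop once the candidate fails the significance test.
        if stop(series, change_points, best_metric):
            return sorted(change_points)
        change_points.append(best_point)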

Stopping Criterion

In this package we have implemented a permutation based test as a stopping criterion:

After step 3 of the multiple change point procedure above, randomly permute all of the data within each cluster, and find the most likely change point for this permuted data using the procedure laid out above.

After performing this operation z times, count the number of permuted change points z' that have higher divergence metrics than the change point you calculated with un-permuted data. The significance level of your change point is thus z'/(z+1).
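
A hedged sketch of such a permutation tester, again reusing the qhat helper and using illustrative names rather than the package's internals; a candidate is accepted while this value stays below the configured significance cutoff.

import numpy as np

def permutation_significance(series, change_points, observed_metric,
                             permutations: int = 100) -> float:
    """Estimate the significance of `observed_metric`: permute the data
    within each cluster, re-find the best candidate, and count wins."""
    rng = np.random.default_rng()
    bounds = sorted([0, *change_points, len(series)])
    exceed = 0  # z': permutations beating the observed metric
    for _ in range(permutations):
        permuted = series.copy()
        for lo, hi in zip(bounds, bounds[1:]):
            rng.shuffle(permuted[lo:hi])  # permute within the cluster only
        best = max(
            (qhat(permuted[lo:hi], tau)
             for lo, hi in zip(bounds, bounds[1:]) if hi - lo >= 5
             for tau in range(2, hi - lo - 2)),
            default=0.0,
        )
        if best > observed_metric:
            exceed += 1
    return exceed / (permutations + 1)  # z' / (z + 1)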

We allow users to configure a permutation tester with pvalue and permutations, representing the significance cutoff for algorithm termination and the number of permutations to perform for each test, respectively.

Example

from signal_processing_algorithms.e_divisive import EDivisive
from signal_processing_algorithms.e_divisive.calculators import cext_calculator
from signal_processing_algorithms.e_divisive.significance_test import QHatPermutationsSignificanceTester
from some_module import series

# Use the C-extension calculator for computing divergence metrics
calculator = cext_calculator
# Permutation tester with a 1% significance threshold, performing
# 100 permutations for each change point candidate
tester = QHatPermutationsSignificanceTester(
    calculator=calculator, pvalue=0.01, permutations=100
)
algo = EDivisive(calculator=calculator, significance_tester=tester)

change_points = algo.get_change_points(series)
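
For a quick end-to-end check, you can feed the algorithm a synthetic step series. We are assuming here (the example above does not spell it out) that series may be any one-dimensional sequence of floats and that the returned change points identify indices near the true shift.

import numpy as np

# Synthetic series: 100 points around 50, then 100 points around 100.
rng = np.random.default_rng(seed=0)
series = np.concatenate([rng.normal(50, 5, 100), rng.normal(100, 5, 100)])

change_points = algo.get_change_points(series)
print(change_points)  # expect a single change point near index 100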

Interactive Documentation

In addition to the package itself and this readme, we have a set of interactive documents that you can use to recreate experiments and investigations of this package, play with them, and make your own!

The requirements for running these documents are:

  • Docker
  • Docker Compose

Once you have these, simply navigate to $REPO/docs, execute docker-compose up and follow the link!

You can also view these documents in non-interactive form without Docker and Docker Compose.
