
Adaptive Mixture ICA in Python

Project description


AMICA-Python

A Python implementation of the AMICA (Adaptive Mixture Independent Component Analysis) algorithm for blind source separation, originally developed in Fortran by Jason Palmer at the Swartz Center for Computational Neuroscience (SCCN).

AMICA-Python is in alpha, but it is correctness-tested against the Fortran implementation and is ready for test driving.

*(Figure: AMICA-Python vs. Fortran outputs)*

Installation

For now, AMICA-Python should be installed from source, and you will need to install PyTorch manually (see below):

git clone https://github.com/scott-huberty/amica-python.git
cd amica-python
pip install -e .

[!IMPORTANT] You must install PyTorch before using AMICA-Python.

Installing PyTorch

Depending on your system and preferences, you can install PyTorch with or without GPU support.

To install the standard version of PyTorch, run:

python -m pip install torch

To install the CPU-only version of PyTorch, run:

python -m pip install torch --index-url https://download.pytorch.org/whl/cpu

Or for Conda users:

conda install -c conda-forge pytorch-cpu

[!WARNING] If you are using an Intel Mac, you cannot install PyTorch via pip, because recent releases no longer ship precompiled wheels for that platform. Instead, you must install PyTorch via Conda, e.g.:

conda install pytorch -c conda-forge

If you use uv, you can also install torch as an extra while installing AMICA-Python:

uv pip install -e ".[torch-cpu]"
uv pip install -e ".[torch-cuda]"

Usage

AMICA-Python exposes a scikit-learn style interface. Here is an example of how to use it:

import numpy as np
from scipy import signal
from amica import AMICA


rng = np.random.default_rng(0)
n_samples = 2000
time = np.linspace(0, 8, n_samples)

s1 = np.sin(2 * time)                     # Sinusoidal
s2 = np.sign(np.sin(3 * time))            # Square wave
s3 = signal.sawtooth(2 * np.pi * time)    # Sawtooth

S = np.c_[s1, s2, s3]
S += 0.2 * rng.standard_normal(S.shape)   # Add noise
S /= S.std(axis=0)                        # Standardize

A = np.array([[1, 1, 1],
              [0.5, 2, 1.0],
              [1.5, 1.0, 2.0]])           # Mixing matrix

X = S @ A.T                               # Observed mixtures

ica = AMICA(random_state=0)
X_new = ica.fit_transform(X)
*(Figure: AMICA-Python vs. FastICA outputs)*
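As a quick sanity check on the toy problem above, here is a numpy-only sketch (independent of AMICA itself): unmixing X with the true inverse of the mixing matrix A recovers the sources, which is exactly what an ICA algorithm must achieve *without* knowing A, up to permutation and scaling of the components.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
n_samples = 2000
time = np.linspace(0, 8, n_samples)

# Same three sources as the usage example above.
s1 = np.sin(2 * time)                     # Sinusoidal
s2 = np.sign(np.sin(3 * time))            # Square wave
s3 = signal.sawtooth(2 * np.pi * time)    # Sawtooth

S = np.c_[s1, s2, s3]
S += 0.2 * rng.standard_normal(S.shape)   # Add noise
S /= S.std(axis=0)                        # Standardize

A = np.array([[1, 1, 1],
              [0.5, 2, 1.0],
              [1.5, 1.0, 2.0]])           # Mixing matrix
X = S @ A.T                               # Observed mixtures

# Unmix with the (normally unknown) true inverse of A.
S_hat = X @ np.linalg.inv(A).T

# Each recovered source should correlate almost perfectly with the truth.
for i in range(3):
    r = np.corrcoef(S[:, i], S_hat[:, i])[0, 1]
    print(f"source {i}: r = {r:.4f}")
```

An ICA estimate replaces `np.linalg.inv(A)` with an unmixing matrix learned from X alone, so the same correlation check (after matching components) is a reasonable way to evaluate any ICA result on this toy data.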

GPU acceleration

If PyTorch was installed with CUDA support, you can fit AMICA on GPU:

ica = AMICA(device='cuda', random_state=0)

For more examples, please see the documentation.

What is AMICA?

AMICA is composed of two main ideas, which are hinted at by the name and the title of the original paper: AMICA: An Adaptive Mixture of Independent Component Analyzers with Shared Components.

1. Adaptive Mixture ICA

Standard ICA assumes each source is independent and non-Gaussian. Extended Infomax ICA improves on this by handling both sub-Gaussian and super-Gaussian sources. AMICA goes further by modeling each source density as a mixture of generalized Gaussians. This flexibility lets AMICA represent virtually any source shape: super-Gaussian, sub-Gaussian, or even some funky bimodal distribution:

*(Figure: source distributions modeled by AMICA)*

In practice, the authors argue that this leads to a more accurate approximation of the source signals.
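To see why mixture densities are so flexible, here is a small numpy sketch (illustrative only; this is not AMICA's actual density model or fitting procedure): a scale mixture of two zero-mean Gaussians has heavy tails (positive excess kurtosis, i.e. super-Gaussian), while a mixture of two well-separated Gaussians is bimodal with negative excess kurtosis (sub-Gaussian).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def excess_kurtosis(x):
    """Sample excess kurtosis: ~0 for a Gaussian."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

# 50/50 component assignment for each sample.
comp = rng.random(n) < 0.5

# Scale mixture: same mean, very different variances -> heavy tails.
super_g = np.where(comp, rng.normal(0, 0.5, n), rng.normal(0, 3.0, n))

# Location mixture: well-separated means -> bimodal, light tails.
sub_g = np.where(comp, rng.normal(-2, 0.5, n), rng.normal(2, 0.5, n))

print(excess_kurtosis(rng.normal(0, 1, n)))  # ~ 0   (Gaussian)
print(excess_kurtosis(super_g))              # > 0   (super-Gaussian)
print(excess_kurtosis(sub_g))                # < 0   (sub-Gaussian)
```

A single Gaussian can represent none of these shapes, which is why a mixture model can approximate essentially arbitrary source distributions.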

2. Shared Components

AMICA can learn multiple ICA decompositions (i.e., models). This is a workaround for ICA's assumption that the sources are stationary (that they do not change over time). AMICA decides which model best explains the data at each sample, effectively allowing the sources to change over time. The "shared components" part of the paper title refers to AMICA's ability to let the various ICA models share some components (i.e., sources) between them, reducing computational load.

What does AMICA-Python implement?

In short, AMICA-Python implements point 1 above (Adaptive Mixture ICA), but does not implement point 2 (running multiple ICA models simultaneously).

AMICA-Python is powered by PyTorch and wrapped in an easy-to-use scikit-learn style interface.

The outputs are numerically tested against the original Fortran implementation to ensure correctness and minimize bugs.
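One subtlety in such numerical comparisons (a generic sketch, not the project's actual test code): ICA is only identifiable up to permutation, sign, and scale of the components, so two equally correct decompositions must be matched, e.g. via component correlations, before their outputs can be compared element-wise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "recovered" source sets that differ only by permutation and sign.
S1 = rng.standard_normal((1000, 3))
perm = [2, 0, 1]
signs = np.array([1.0, -1.0, 1.0])
S2 = S1[:, perm] * signs

# Match components by absolute cross-correlation.
C = np.corrcoef(S1.T, S2.T)[:3, 3:]      # 3x3 cross-correlation block
match = np.argmax(np.abs(C), axis=1)     # S2 column matching each S1 column

# After permuting back and fixing signs, the two decompositions agree.
S2_aligned = S2[:, match] * np.sign(C[np.arange(3), match])
print(np.allclose(S1, S2_aligned))       # True
```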
