Picard: Preconditioned ICA for Real Data
=========================================
|GHActions|_ |PyPI|_ |Codecov|_ |Downloads|_
.. |GHActions| image:: https://github.com/pierreablin/picard/workflows/unittests/badge.svg?branch=master&event=push
.. _GHActions: https://github.com/pierreablin/picard/actions
.. |Codecov| image:: http://codecov.io/github/pierreablin/picard/coverage.svg?branch=master
.. _Codecov: http://codecov.io/github/pierreablin/picard?branch=master
.. |PyPI| image:: https://badge.fury.io/py/python-picard.svg
.. _PyPI: https://badge.fury.io/py/python-picard
.. |Downloads| image:: http://pepy.tech/badge/python-picard
.. _Downloads: http://pepy.tech/project/python-picard
This repository hosts Python and Matlab/Octave code for the Preconditioned ICA
for Real Data (Picard) and Picard-O algorithms.
See the `documentation <https://pierreablin.github.io/picard/index.html>`_.
Algorithm
---------
Picard is an algorithm for maximum likelihood independent component analysis.
It exhibits state-of-the-art speed of convergence, and solves the same problems as the widely used FastICA, Infomax, and extended-Infomax algorithms, faster.
.. image:: comparison.png
:scale: 50 %
:alt: Comparison
:align: center
The parameter `ortho` chooses whether to work under the orthogonal constraint (i.e., to enforce the decorrelation of the output) or not.
Picard also comes with an extended version, as in extended-Infomax, which makes it possible to separate both sub- and super-Gaussian signals.
It is enabled with the parameter `extended`. Depending on these two flags, Picard matches the following classical algorithms (a short usage sketch follows the list):
* `ortho=False, extended=False`: same solution as Infomax
* `ortho=False, extended=True`: same solution as extended-Infomax
* `ortho=True, extended=True`: same solution as FastICA
* `ortho=True, extended=False`: same solution as Infomax under the orthogonal constraint
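For example, here is a minimal sketch of how these flags are passed, assuming a mixed-signals array ``X`` such as the one built in the Quickstart below:

.. code:: python

>>> from picard import picard
>>> # Infomax-like solution: no orthogonal constraint, no extension
>>> K, W, Y = picard(X, ortho=False, extended=False)
>>> # FastICA-like solution: orthogonal constraint with the extended update
>>> K, W, Y = picard(X, ortho=True, extended=True)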
Installation
------------
We recommend the `Anaconda Python distribution <https://www.continuum.io/downloads>`_.
conda
~~~~~
Picard can be installed with `conda-forge <https://conda-forge.org/docs/user/introduction.html>`_.
You need to add `conda-forge` to your conda channels, and then install the package::
$ conda config --add channels conda-forge
$ conda install python-picard
pip
~~~
Otherwise, to install ``picard``, you first need to install its dependencies::
$ pip install numpy matplotlib numexpr scipy
Then install Picard with pip::
$ pip install python-picard
or to get the latest version of the code::
$ pip install git+https://github.com/pierreablin/picard.git#egg=picard
If you do not have admin privileges on the computer, use the ``--user`` flag
with `pip`. To upgrade, use the ``--upgrade`` flag provided by `pip`.
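For example, both flags can be combined to upgrade an existing user installation::
$ pip install --user --upgrade python-picard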
check
~~~~~
To check if everything worked fine, you can do::
$ python -c 'import picard'
and it should not raise any error.
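You can also print the installed version (assuming the package exposes the conventional ``__version__`` attribute; this is a common convention, not a guarantee)::
$ python -c 'import picard; print(picard.__version__)'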
matlab/octave
~~~~~~~~~~~~~
The Matlab/Octave version of Picard and Picard-O is `available here <https://github.com/pierreablin/picard/tree/master/matlab_octave>`_.
Quickstart
----------
To get started, you can build a synthetic matrix of mixed signals:
.. code:: python
>>> import numpy as np
>>> N, T = 3, 1000  # number of sources, number of samples
>>> S = np.random.laplace(size=(N, T))  # independent, super-Gaussian sources
>>> A = np.random.randn(N, N)  # random mixing matrix
>>> X = np.dot(A, S)  # observed mixed signals
And then use Picard to separate the signals:
.. code:: python
>>> from picard import picard
>>> K, W, Y = picard(X)
Picard outputs the whitening matrix ``K``, the estimated unmixing matrix ``W``, and
the estimated sources ``Y``. They are linked by ``Y = W K X``.
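As a quick sanity check, the shapes are consistent with this relation. Continuing the session above (``Y_check`` is just an illustrative name):

.. code:: python

>>> Y_check = np.dot(W, np.dot(K, X))
>>> Y_check.shape == Y.shape
True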
NEW: scikit-learn compatible API
--------------------------------
Introducing `picard.Picard`, which mimics `sklearn.decomposition.FastICA` behavior:
.. code:: python
>>> from sklearn.datasets import load_digits
>>> from picard import Picard
>>> X, _ = load_digits(return_X_y=True)
>>> transformer = Picard(n_components=7)
>>> X_transformed = transformer.fit_transform(X)
>>> X_transformed.shape
(1797, 7)
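Because the estimator follows the scikit-learn transformer API, it can be combined with other scikit-learn tools, for instance in a pipeline (a minimal sketch, relying only on the standard transformer interface):

.. code:: python

>>> from sklearn.pipeline import make_pipeline
>>> from sklearn.preprocessing import StandardScaler
>>> pipe = make_pipeline(StandardScaler(), Picard(n_components=7))
>>> pipe.fit_transform(X).shape
(1797, 7)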
Dependencies
------------
These are the dependencies to use Picard:
* numpy (>=1.8)
* matplotlib (>=1.3)
* numexpr (>=2.0)
* scipy (>=0.19)
These are the dependencies to run the EEG example:
* mne (>=0.14)
Cite
----
If you use this code in your project, please cite::
Pierre Ablin, Jean-François Cardoso, Alexandre Gramfort
Faster independent component analysis by preconditioning with Hessian approximations
IEEE Transactions on Signal Processing, 2018
https://arxiv.org/abs/1706.08171
Pierre Ablin, Jean-François Cardoso, Alexandre Gramfort
Faster ICA under orthogonal constraint
ICASSP, 2018
https://arxiv.org/abs/1711.10873