
Python implementation of the svGPFA algorithm (Duncker and Sahani, 2018)

Project description

Python implementation of Sparse Variational Gaussian Process Factor Analysis (svGPFA, Duncker and Sahani, 2018)

svGPFA identifies common latent structure in neural population spike trains. It uses shared latent Gaussian processes, which are combined linearly as in Gaussian Process Factor Analysis (GPFA, Yu et al., 2009). svGPFA extends GPFA to unbinned spike-train data by using a continuous-time point-process likelihood model, and it achieves scalability through a sparse variational approximation.
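
As a rough sketch of the generative model (paraphrasing Duncker and Sahani, 2018; the notation below is illustrative and not taken from the code), each trial is described by K shared latent Gaussian processes that are mixed linearly into per-neuron point-process intensities:

    x_k(t) \sim \mathcal{GP}\left(0, \kappa_k(t, t')\right), \quad k = 1, \dots, K
    h_n(t) = \sum_{k=1}^{K} c_{nk}\, x_k(t) + d_n
    \lambda_n(t) = g\left(h_n(t)\right)

where the spikes of neuron n are generated by a point process with intensity \lambda_n(t), g is a positive link function (e.g., the exponential), c_{nk} are mixing weights and d_n are per-neuron offsets. Sparse variational inference over inducing points for each latent process is what keeps estimation tractable in continuous time.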

Examples and Documentation

You can run svGPFA on sample data, plot its estimates and perform goodness-of-fit tests (without installing anything on your computer) simply by running this Google Colab notebook. You can also do this by installing svGPFA (instructions below) and running this Jupyter notebook. In addition, after installing svGPFA, you can estimate models using a script, as shown in the section Testing the installation below.

Documentation can be found here.

Installation

We recommend installing svGPFA in a Python virtual environment.

  1. clone this repo

  2. change the current directory to that of the cloned repo

    cd svGPFA
    
  3. if you will not run the example notebooks (see above), in the root directory of the cloned repo type

    pip install -e .
    

    If you will run the example notebooks (see above), in the root directory of the cloned repo type

    pip install -e .[notebook]

    Note that some shells (e.g., zsh) require the extra to be quoted: pip install -e ".[notebook]". A minimal Python import check is sketched at the beginning of the Testing the installation section below.

Testing the installation
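
Before running the example script, you can do a quick optional check that the installation worked (a minimal sketch; it only assumes that the installed package is importable under the name svGPFA):

    # minimal sanity check: the package should be importable after pip install -e .
    import svGPFA

    # printing the module shows the path it was loaded from
    # (for an editable install, this should point into the cloned repo)
    print(svGPFA)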

  1. from the root directory of the cloned svGPFA repo, change the current directory to examples/scripts

    cd examples/scripts
    
  2. run the estimation of svGPFA parameters (for only two EM iterations)

    python doEstimateSVGPFA.py --em_max_iter=2
    
  3. if everything went well, the previous script should terminate after showing a line like the following in the standard output (the sketch after these steps shows one way to inspect the saved file):

    Saved results to results/xxxxxxxx_etimationRes.pickle
    
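
If you want to inspect the saved estimates programmatically, the sketch below shows one possible way to open the results file. It is an illustration under assumptions not stated above: the file is taken to be a standard Python pickle (it may contain PyTorch objects, in which case torch must be installed to unpickle it), xxxxxxxx stands for the run-specific prefix printed by the script, and the contents of the loaded object are not documented here, so the code only reports its top-level structure.

    import pickle

    # hypothetical path: replace xxxxxxxx with the prefix printed by doEstimateSVGPFA.py
    results_filename = "results/xxxxxxxx_etimationRes.pickle"

    # assumes a standard pickle; unpickling may require torch if the file stores PyTorch objects
    with open(results_filename, "rb") as f:
        results = pickle.load(f)

    # report the top-level structure without assuming any particular keys
    if isinstance(results, dict):
        print("top-level keys:", list(results.keys()))
    else:
        print("loaded object of type:", type(results))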

Citing us

If you use svGPFA, please cite the following paper:

Lea Duncker and Maneesh Sahani (2018). Temporal alignment and latent Gaussian process factor inference in population spike trains. 32nd Conference on Neural Information Processing Systems, Montréal, Canada

@article{duncker2018temporal,
  title={Temporal alignment and latent Gaussian process factor inference in population spike trains},
  author={Duncker, Lea and Sahani, Maneesh},
  journal={Advances in neural information processing systems},
  volume={31},
  year={2018}
}

Development team

  • Joaquin Rapela (Gatsby Computational Neuroscience Unit, University College London)

  • Maneesh Sahani (Gatsby Computational Neuroscience Unit, University College London)

Acknowledgements

The research and development for svGPFA is supported by funding from the Gatsby Charitable Foundation.

