
neuropop

analysis tools for large-scale neural recordings

Copyright (C) 2023 Howard Hughes Medical Institute Janelia Research Campus, the labs of Carsen Stringer and Marius Pachitariu

This code is licensed under GPL v3 (no redistribution without credit and no redistribution in private repos; see the license for more details).

references

[1] Stringer, C.*, Pachitariu, M.*, Steinmetz, N., Carandini, M., & Harris, K. D. (2019). High-dimensional geometry of population responses in visual cortex. Nature, 571(7765), 361-365.

[2] Stringer, C.*, Pachitariu, M.*, Steinmetz, N., Reddy, C. B., Carandini, M., & Harris, K. D. (2019). Spontaneous behaviors drive multidimensional, brainwide activity. Science, 364(6437), eaav7893.

[3] Syeda, A., Zhong, L., Tung, R., Long, W., Pachitariu, M.*, & Stringer, C.* (2022). Facemap: a framework for modeling neural activity based on orofacial tracking. bioRxiv.

dimensionality.py

This module contains code for dimensionality estimation methods for neural data:

  • Cross-validated PCA (described in [1]) is for estimating the dimensionality of neural stimulus responses where each stimulus is shown at least twice. Divide your data into two repeats -- a matrix of 2 x stimuli x neurons -- and input it into the function cvPCA to obtain the cross-validated eigenspectrum. Note that each repeat can be the average of several stimulus responses (e.g. 5-10 each). You can then use the get_powerlaw function to estimate the exponent of the decay of the eigenspectrum. If you use these functions please cite [1].
  • Shared variance components analysis (described in [2]) is for estimating the dimensionality of neural activity that is shared across neurons (excluding single-neuron variability). This method splits the neurons into two halves and the timepoints into training and testing sets. It then computes the principal components of the covariance between the two neural halves on the training timepoints, and computes the variance of those components on the testing timepoints. Take your neural data as a matrix of neurons x time and input it into the function SVCA to obtain scov, the variance of each component of the covariance matrix on the test set (this had a powerlaw decay of ~1.1 in [2]). The function also returns varcov, the average variance of each component within each neural half. If you use this function please cite [2].
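The two methods above can be sketched in a few lines of NumPy. This is an illustrative reimplementation under assumed conventions (X of shape 2 x stimuli x neurons for cross-validated PCA; a z-scored neurons x time matrix for SVCA), not the package's own code -- for real analyses use neuropop's cvPCA, get_powerlaw, and SVCA:

```python
import numpy as np

def cv_pca(X):
    """Cross-validated PCA sketch. X: (2, nstim, nneur), two repeats."""
    X = X - X.mean(axis=1, keepdims=True)        # center each repeat across stimuli
    _, _, Vt = np.linalg.svd(X[0], full_matrices=False)  # PCs from repeat 0
    proj0 = X[0] @ Vt.T                          # projections of repeat 0
    proj1 = X[1] @ Vt.T                          # projections of the held-out repeat
    return (proj0 * proj1).sum(axis=0)           # cross-validated eigenspectrum

def powerlaw_exponent(ss, trange):
    """Decay exponent: slope of the eigenspectrum in log-log space over ranks in trange."""
    logr = np.log(np.arange(1, len(ss) + 1))
    slope, _ = np.polyfit(logr[trange], np.log(ss[trange]), 1)
    return -slope

def svca(S, rng):
    """SVCA sketch. S: (neurons, time), z-scored. Returns (scov, varcov)."""
    n, t = S.shape
    pn, pt = rng.permutation(n), rng.permutation(t)
    n1, n2 = pn[: n // 2], pn[n // 2 :]          # two neural halves
    ttrain, ttest = pt[: t // 2], pt[t // 2 :]   # train/test timepoints
    cov = S[np.ix_(n1, ttrain)] @ S[np.ix_(n2, ttrain)].T  # cross-half covariance
    U, _, Vt = np.linalg.svd(cov, full_matrices=False)
    p1 = U.T @ S[np.ix_(n1, ttest)]              # project test activity of half 1
    p2 = Vt @ S[np.ix_(n2, ttest)]               # project test activity of half 2
    scov = (p1 * p2).mean(axis=1)                # shared variance on the test set
    varcov = 0.5 * ((p1 ** 2).mean(axis=1) + (p2 ** 2).mean(axis=1))
    return scov, varcov
```

With identical repeats, cv_pca reduces to the ordinary eigenspectrum (squared singular values); with noisy repeats, components that do not replicate across repeats shrink toward zero.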

peer_prediction.py

Prediction of one half of neurons from the other half of neurons, in order to estimate an upper bound for the amount of predictable variance in the neural population. If you use this function, please cite [2].
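The idea can be sketched with plain ridge regression (an illustrative version on a neurons x time matrix with an assumed half/half time split, not the package's peer_prediction code):

```python
import numpy as np

def peer_predict(S, lam=1.0, rng=None):
    """Predict one neural half from the other with ridge regression.
    S: (neurons, time). Returns the fraction of variance explained on test timepoints."""
    rng = np.random.default_rng(0) if rng is None else rng
    n, t = S.shape
    half = rng.permutation(n)
    A = S[half[: n // 2]].T                      # (time, neurons) predictor half
    B = S[half[n // 2 :]].T                      # (time, neurons) target half
    tr, te = np.arange(t // 2), np.arange(t // 2, t)  # first half trains, second tests
    W = np.linalg.solve(A[tr].T @ A[tr] + lam * np.eye(A.shape[1]), A[tr].T @ B[tr])
    resid = ((B[te] - A[te] @ W) ** 2).sum()
    return 1 - resid / (B[te] ** 2).sum()
```

Activity shared across the two halves is predictable; private single-neuron noise is not, so this score upper-bounds how much of the population variance any external predictor (e.g. behavior) could explain.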

linear_prediction.py

This module contains code for ridge regression and regularized reduced rank regression, particularly for predicting neural activity from behavior. The main function is prediction_wrapper: if rank is None, ridge regression is performed; otherwise reduced rank regression is performed. This function assumes you have pytorch with GPU support; otherwise set device=torch.device('cpu'). CCA is also implemented, without GPU support. If you use these functions please cite [2].
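For intuition, one common form of reduced rank regression is a ridge solution whose fitted values are projected onto their top singular subspace. This NumPy sketch follows that recipe under assumed conventions (time x features inputs); prediction_wrapper itself runs in PyTorch and may differ in details:

```python
import numpy as np

def reduced_rank_regression(X, Y, rank=None, lam=1.0):
    """Ridge regression from X (time, features) to Y (time, targets);
    if rank is given, constrain the coefficient matrix to that rank."""
    B = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)  # ridge solution
    if rank is None:
        return B
    _, _, Vt = np.linalg.svd(X @ B, full_matrices=False)  # SVD of the fitted values
    V = Vt[:rank].T                                       # top response directions
    return B @ V @ V.T                                    # rank-constrained coefficients
```

When the true mapping is low-rank, the rank-constrained fit predicts nearly as well as full ridge while using far fewer effective parameters.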

nn_prediction.py

This module contains code for non-linear prediction of neural activity from behavior, as described in [3]. The main function is network_wrapper. It assumes you have pytorch with GPU support (otherwise set device=torch.device('cpu')), and that you have computed the principal components U of the data, e.g.:

import numpy as np

# z-score neural activity
spks -= spks.mean(axis=1)[:, np.newaxis]
std = ((spks**2).mean(axis=1) ** 0.5)[:, np.newaxis]
std[std == 0] = 1
spks /= std

# compute principal components
from sklearn.decomposition import PCA
Y = PCA(n_components=128).fit_transform(spks.T)
U = spks @ Y
U /= (U**2).sum(axis=0) ** 0.5

# predict Y from behavior variables x (z-score x if using keypoints)
# tcam are camera/behavior timestamps, tneural are neural timestamps
varexp, varexp_neurons, spks_pred_test0, itest, model = network_wrapper(x, Y, tcam, tneural, U, spks, delay=-1, verbose=True)

If you use these functions please cite [3].

future_prediction.py

This contains functions for predicting behavioral or neural variables into the future using ridge regression with exponential basis functions (see Figure 1 in [3]). If you use these functions please cite [3].
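A minimal sketch of the basis construction (illustrative only; the time constants, normalization, and helper names here are assumptions, not the package's exact choices):

```python
import numpy as np

def exponential_basis(n_lags, taus):
    """Basis of decaying exponentials over future lags: (n_lags, len(taus))."""
    t = np.arange(n_lags)[:, None]
    B = np.exp(-t / np.asarray(taus, float)[None, :])
    return B / B.sum(axis=0, keepdims=True)      # each basis function sums to 1

def future_targets(y, B):
    """Project future windows of y (time,) onto the basis -> (time - n_lags, K).
    Row t summarizes y[t+1 : t+1+n_lags] at the timescales in taus."""
    n_lags = B.shape[0]
    windows = np.lib.stride_tricks.sliding_window_view(y, n_lags)[1:]
    return windows @ B
```

Ridge regression onto these targets then yields smooth predictions of a variable's future at several timescales at once, rather than fitting a separate regression per future lag.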

requirements

This package relies on the following excellent packages (please also cite them in your work if you use them):

  • numpy
  • scipy
  • scikit-learn
  • pytorch
