
A package for processing neural datasets


Documentation | Join our Discord community


brainsets is a Python package for processing neural data into a standardized format.

Installation

brainsets supports Python 3.8 through 3.11.

To install the package, run the following command:

pip install brainsets
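If you are unsure whether your interpreter falls in the supported range, a quick check:

```python
import sys

# brainsets supports Python 3.8 through 3.11 (see above).
supported = (3, 8) <= sys.version_info[:2] <= (3, 11)
print(f"Python {sys.version_info.major}.{sys.version_info.minor} supported: {supported}")
```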

List of available brainsets

brainset_id                    Brainset Card  Raw Data Size  Processed Data Size
churchland_shenoy_neural_2012  Link           46 GB          25 GB
flint_slutzky_accurate_2012    Link           3.2 GB         151 MB
odoherty_sabes_nonhuman_2017   Link           22 GB          26 GB
pei_pandarinath_nlb_2021       Link           688 KB         22 MB
perich_miller_population_2018  Link           13 GB          2.9 GB
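To estimate total disk usage before downloading everything, you can sum the sizes from the table above. A rough sketch (sizes copied from the table; actual usage may differ):

```python
# Approximate sizes from the table above, converted to GB.
RAW_GB = {
    "churchland_shenoy_neural_2012": 46,
    "flint_slutzky_accurate_2012": 3.2,
    "odoherty_sabes_nonhuman_2017": 22,
    "pei_pandarinath_nlb_2021": 0.000688,  # 688 KB
    "perich_miller_population_2018": 13,
}
PROCESSED_GB = {
    "churchland_shenoy_neural_2012": 25,
    "flint_slutzky_accurate_2012": 0.151,  # 151 MB
    "odoherty_sabes_nonhuman_2017": 26,
    "pei_pandarinath_nlb_2021": 0.022,  # 22 MB
    "perich_miller_population_2018": 2.9,
}

print(f"raw total: ~{sum(RAW_GB.values()):.1f} GB")
print(f"processed total: ~{sum(PROCESSED_GB.values()):.1f} GB")
```

Preparing all five brainsets therefore needs roughly 84 GB for raw data plus 54 GB for processed data.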

Acknowledgements

This work is made possible by the public release of these valuable datasets by the original researchers. If you use any dataset processed by brainsets in your research, please cite the appropriate original papers and follow any usage guidelines specified by the dataset creators. Proper attribution credits the researchers who collected and shared the data and helps promote open science practices in the neuroscience community. The original papers and usage guidelines for each dataset are listed in the brainsets documentation.

Using the brainsets CLI

Configuring data directories

First, configure the directories where brainsets will store raw and processed data:

brainsets config

You will be prompted to enter the paths to the raw and processed data directories.

$> brainsets config
Enter raw data directory: ./data/raw
Enter processed data directory: ./data/processed

You can update the configuration at any time by running the config command again.
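It can be convenient to create these directories ahead of time. A minimal sketch, using the paths from the example prompt above:

```python
from pathlib import Path

# The directories entered at the `brainsets config` prompt above.
raw_dir = Path("./data/raw")
processed_dir = Path("./data/processed")

for d in (raw_dir, processed_dir):
    d.mkdir(parents=True, exist_ok=True)  # no-op if it already exists
    print(f"{d} exists: {d.exists()}")
```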

Listing available datasets

You can list the available datasets by running the list command:

brainsets list

Preparing data

You can prepare a dataset by running the prepare command:

brainsets prepare <brainset>

Data preparation involves downloading the raw data from the source and then processing it, following a set of rules defined in pipelines/<brainset>/.

For example, to prepare the Perich & Miller (2018) dataset, you can run:

brainsets prepare perich_miller_population_2018 --cores 8
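To prepare several brainsets in one go, you can script the CLI. A sketch using subprocess (the IDs come from the table above; the actual subprocess call is commented out so the snippet is safe to dry-run):

```python
import subprocess

# IDs from the list of available brainsets above.
BRAINSETS = [
    "flint_slutzky_accurate_2012",
    "pei_pandarinath_nlb_2021",
]

def prepare_command(brainset_id, cores=8):
    """Build the documented `brainsets prepare <brainset> --cores N` invocation."""
    return ["brainsets", "prepare", brainset_id, "--cores", str(cores)]

for bid in BRAINSETS:
    cmd = prepare_command(bid)
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually run
```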

Contributing

If you are planning to contribute to the package, you can install the package in development mode by running the following command:

pip install -e ".[dev]"

Install pre-commit hooks:

pre-commit install

Unit tests are located under test/. Run the entire test suite with

pytest

or run individual test files, e.g., pytest test/test_enum_unique.py

Cite

Please cite our paper if you use this code in your own work:

@inproceedings{azabou2023unified,
    title={A Unified, Scalable Framework for Neural Population Decoding},
    author={Mehdi Azabou and Vinam Arora and Venkataramana Ganesh and Ximeng Mao and Santosh Nachimuthu and Michael Mendelson and Blake Richards and Matthew Perich and Guillaume Lajoie and Eva L. Dyer},
    booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
    year={2023},
}
