
A package for processing neural datasets

Project description

Documentation | Join our Discord community


brainsets is a Python package for processing neural data into a standardized format.

Installation

brainsets supports Python 3.8 through Python 3.11.

To install the package, run the following command:

pip install brainsets
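After installing, you can sanity-check your environment. The snippet below is a generic, standard-library sketch (not part of the brainsets API): it reports whether a package is installed and whether the running interpreter falls in the supported 3.8–3.11 range.

```python
import sys
from importlib.metadata import version, PackageNotFoundError


def installed_version(package="brainsets"):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None


def python_supported(info=sys.version_info):
    """True if the (major, minor) version is within the supported 3.8-3.11 range."""
    return (info[0], info[1]) in [(3, 8), (3, 9), (3, 10), (3, 11)]
```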

List of available brainsets

brainset_id                     Brainset Card   Raw Data Size   Processed Data Size
churchland_shenoy_neural_2012   Link            46 GB           25 GB
flint_slutzky_accurate_2012     Link            3.2 GB          151 MB
odoherty_sabes_nonhuman_2017    Link            22 GB           26 GB
pei_pandarinath_nlb_2021        Link            688 KB          22 MB
perich_miller_population_2018   Link            13 GB           2.9 GB
kemp_sleep_edf_2013             TBA             8.2 GB          60 GB
allen_visual_coding_ophys_2016  TBA             356 GB          58 GB

Acknowledgements

This work is only made possible thanks to the public release of these valuable datasets by the original researchers. If you use any of the datasets processed by brainsets in your research, please make sure to cite the appropriate original papers and follow any usage guidelines specified by the dataset creators. Proper attribution not only gives credit to the researchers who collected and shared the data but also helps promote open science practices in the neuroscience community. You can find the original papers and usage guidelines for each dataset in the brainsets documentation.

Using the brainsets CLI

Configuring data directories

First, configure the directories where brainsets will store raw and processed data:

brainsets config

You will be prompted to enter the paths to the raw and processed data directories.

$> brainsets config
Enter raw data directory: ./data/raw
Enter processed data directory: ./data/processed

You can update the configuration at any time by running the config command again.

Listing available datasets

You can list the available datasets by running the list command:

brainsets list

Preparing data

You can prepare a dataset by running the prepare command:

brainsets prepare <brainset>

Data preparation downloads the raw data from its source and then processes it according to the rules defined in pipelines/<brainset>/.
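Conceptually, prepare runs two stages: fetch the raw files, then transform them into the standardized format. The stub below is a hypothetical illustration of that download-then-process pattern; `download`, `process`, and the file layout are invented for this sketch and are not the actual pipeline API.

```python
from pathlib import Path


def download(brainset_id, raw_dir):
    """Stub: fetch raw files for `brainset_id` into raw_dir (placeholder content)."""
    raw_dir.mkdir(parents=True, exist_ok=True)
    target = raw_dir / f"{brainset_id}.raw"
    target.write_bytes(b"raw data placeholder")
    return target


def process(raw_path, processed_dir):
    """Stub: transform a raw file into a standardized artifact."""
    processed_dir.mkdir(parents=True, exist_ok=True)
    out = processed_dir / (raw_path.stem + ".processed")
    out.write_bytes(raw_path.read_bytes().upper())
    return out


def prepare(brainset_id, raw_dir, processed_dir):
    """Download raw data, then process it: the two stages `prepare` chains together."""
    raw = download(brainset_id, Path(raw_dir))
    return process(raw, Path(processed_dir))
```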

For example, to prepare the Perich & Miller (2018) dataset, you can run:

brainsets prepare perich_miller_population_2018 --cores 8

Contributing

If you plan to contribute, install the package in development mode by running:

pip install -e ".[dev]"

Install pre-commit hooks:

pre-commit install

Unit tests are located under test/. Run the entire test suite with

pytest

or test individual files via, e.g., pytest test/test_enum_unique.py

Cite

Please cite our paper if you use this code in your own work:

@inproceedings{azabou2023unified,
    title={A Unified, Scalable Framework for Neural Population Decoding},
    author={Mehdi Azabou and Vinam Arora and Venkataramana Ganesh and Ximeng Mao and Santosh Nachimuthu and Michael Mendelson and Blake Richards and Matthew Perich and Guillaume Lajoie and Eva L. Dyer},
    booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
    year={2023},
}

Project details


Download files

Download the file for your platform.

Source Distribution

brainsets-0.2.0.tar.gz (835.5 kB)

Uploaded Source

Built Distribution


brainsets-0.2.0-py3-none-any.whl (78.5 kB)

Uploaded Python 3

File details

Details for the file brainsets-0.2.0.tar.gz.

File metadata

  • Download URL: brainsets-0.2.0.tar.gz
  • Upload date:
  • Size: 835.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for brainsets-0.2.0.tar.gz
Algorithm Hash digest
SHA256 c195ff9d2bb3396c209f347e0c0cd85f6786d96fae53a75d123bbf5b4e4c6e71
MD5 f31f06dbba27b58f1b6ff917ef2a8c9a
BLAKE2b-256 5e47cbd1eb33ed96798de4d82c95ecb49131ea1d546af773d1a0e5c5046ba8e8
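To verify a downloaded file against a published digest, a standard-library sketch (the filename below assumes the sdist sits in the current directory; `sha256_of` is a helper invented here, not a brainsets function):

```python
import hashlib


def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Published digest for brainsets-0.2.0.tar.gz (from the table above):
EXPECTED = "c195ff9d2bb3396c209f347e0c0cd85f6786d96fae53a75d123bbf5b4e4c6e71"
# sha256_of("brainsets-0.2.0.tar.gz") == EXPECTED  means the download is intact.
```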


Provenance

The following attestation bundles were made for brainsets-0.2.0.tar.gz:

Publisher: publish.yml on neuro-galaxy/brainsets

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file brainsets-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: brainsets-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 78.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for brainsets-0.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 fdcf386eab56caa3f591c5a0ae31f52291163ce4717ef0e8069c2e09634045cc
MD5 c086e9ad8ae9db2ab69c7ba83d8e4514
BLAKE2b-256 79971cbb57467e293ceafac1e66dc9ce1be69309d8dc9e88b853ee17a9324595


Provenance

The following attestation bundles were made for brainsets-0.2.0-py3-none-any.whl:

Publisher: publish.yml on neuro-galaxy/brainsets

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
