# brainsets

A package for processing neural datasets.
Documentation | Join our Discord community
brainsets is a Python package for processing neural data into a standardized format.
## Installation

brainsets is available for Python 3.8 to Python 3.11.

To install the package, run:

```bash
pip install brainsets
```
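Since brainsets supports Python 3.8 through 3.11, a quick interpreter check before installing can save a failed install. This helper is not part of brainsets; it is a small sketch:

```python
import sys

def brainsets_supported(version=None):
    """Return True if the (major, minor) version falls in the supported 3.8-3.11 range."""
    major, minor = (version or sys.version_info)[:2]
    return (3, 8) <= (major, minor) <= (3, 11)

if __name__ == "__main__":
    label = "supported" if brainsets_supported() else "unsupported"
    print(label, sys.version.split()[0])
```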
## List of available brainsets
| brainset_id | Brainset Card | Raw Data Size | Processed Data Size |
|---|---|---|---|
| churchland_shenoy_neural_2012 | Link | 46 GB | 25 GB |
| flint_slutzky_accurate_2012 | Link | 3.2 GB | 151 MB |
| odoherty_sabes_nonhuman_2017 | Link | 22 GB | 26 GB |
| pei_pandarinath_nlb_2021 | Link | 688 KB | 22 MB |
| perich_miller_population_2018 | Link | 13 GB | 2.9 GB |
## Acknowledgements
This work is only made possible thanks to the public release of these valuable datasets by the original researchers. If you use any of the datasets processed by brainsets in your research, please make sure to cite the appropriate original papers and follow any usage guidelines specified by the dataset creators. Proper attribution not only gives credit to the researchers who collected and shared the data but also helps promote open science practices in the neuroscience community. You can find the original papers and usage guidelines for each dataset in the brainsets documentation.
## Using the brainsets CLI

### Configuring data directories

First, configure the directories where brainsets will store raw and processed data:

```bash
brainsets config
```

You will be prompted to enter the paths to the raw and processed data directories:

```console
$> brainsets config
Enter raw data directory: ./data/raw
Enter processed data directory: ./data/processed
```

You can update the configuration at any time by re-running the config command.
### Listing available datasets

You can list the available datasets by running the list command:

```bash
brainsets list
```
### Preparing data

You can prepare a dataset by running the prepare command:

```bash
brainsets prepare <brainset>
```

Data preparation involves downloading the raw data from its source and then processing it, following a set of rules defined in pipelines/<brainset>/.

For example, to prepare the Perich & Miller (2018) dataset using 8 cores, run:

```bash
brainsets prepare perich_miller_population_2018 --cores 8
```
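To prepare several datasets in one go, the CLI can be scripted. Below is a minimal Python sketch that builds the documented `brainsets prepare` invocations; the dataset IDs come from the table above, while the `prepare_commands` helper is illustrative, not part of brainsets:

```python
import shlex

def prepare_commands(brainset_ids, cores=8):
    """Build one `brainsets prepare` command line per dataset ID."""
    return [f"brainsets prepare {shlex.quote(bid)} --cores {cores}"
            for bid in brainset_ids]

if __name__ == "__main__":
    ids = ["flint_slutzky_accurate_2012", "perich_miller_population_2018"]
    for cmd in prepare_commands(ids):
        print(cmd)
        # To actually run each pipeline (requires brainsets installed and configured):
        # subprocess.run(shlex.split(cmd), check=True)
```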
## Contributing

If you are planning to contribute to the package, install it in development mode:

```bash
pip install -e ".[dev]"
```

Then install the pre-commit hooks:

```bash
pre-commit install
```

Unit tests are located under test/. Run the entire test suite with:

```bash
pytest
```

or test individual files via, e.g.:

```bash
pytest test/test_enum_unique.py
```
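New tests follow the standard pytest layout: drop a `test_*.py` file under test/ and pytest discovers it automatically. The file and function below are hypothetical stand-ins to show the shape, not actual brainsets code:

```python
# test/test_binning.py -- hypothetical example; not a real brainsets test file
def bin_spikes(timestamps, bin_size):
    """Count spikes per fixed-width time bin (toy stand-in for a real helper)."""
    counts = {}
    for t in timestamps:
        idx = int(t // bin_size)
        counts[idx] = counts.get(idx, 0) + 1
    return counts

def test_bin_spikes_counts_per_bin():
    assert bin_spikes([0.1, 0.2, 1.5], bin_size=1.0) == {0: 2, 1: 1}

def test_bin_spikes_empty_input():
    assert bin_spikes([], bin_size=1.0) == {}
```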
## Cite

Please cite our paper if you use this code in your own work:

```bibtex
@inproceedings{azabou2023unified,
  title={A Unified, Scalable Framework for Neural Population Decoding},
  author={Mehdi Azabou and Vinam Arora and Venkataramana Ganesh and Ximeng Mao and Santosh Nachimuthu and Michael Mendelson and Blake Richards and Matthew Perich and Guillaume Lajoie and Eva L. Dyer},
  booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
  year={2023},
}
```