
The Precomputed Functional Connectome Toolkit



pfc-toolkit

The Precomputed Functional Connectome Toolkit (pfc-toolkit) is a Python module for functional lesion network mapping using the Precomputed Functional Connectome. It is distributed under the 3-Clause BSD license.

This project is also known as the Precomputed Human Brain Connectome Toolkit.

The project was started in 2019 by William Drew for his Harvard undergraduate thesis project ('21).

The project was supervised by Michael D. Fox, MD, PhD, first at the Berenson-Allen Center for Noninvasive Brain Stimulation at Beth Israel Deaconess Medical Center and later at the Center for Brain Circuit Therapeutics at Brigham and Women's Hospital.

This project builds on @clin045's and @alexlicohen's connectome_quick.py from nimlab and @andreashorn's connectome.sh from LeadDBS.

The Precomputed Functional Connectome Toolkit is for research use only; do not use results from the PFC Toolkit for clinical decisions.

Installation

Dependencies

pfc-toolkit requires:

  • Python (>=3.6)
  • NumPy
  • SciPy
  • Numba
  • Nibabel
  • Nilearn
  • tqdm
  • natsort
  • importlib

User Installation

Install using pip

pip install pfc-toolkit

Usage

Generate Functional Lesion Network Maps

  1. Create a folder named pfctoolkit_config in your home directory:
mkdir ~/pfctoolkit_config
  2. Set up your precomputed connectome config file. An example config file is located here. Copy it into the pfctoolkit_config folder, then edit it, replacing the precomputed connectome paths with the locations of the connectome files on your machine.

  3. Run the precomputed connectome script. If your precomputed connectome config file is named yeo1000_dil.json, the name of your precomputed connectome config is yeo1000_dil.

connectome_precomputed -r <path to directory containing rois> -c <name of precomputed connectome config> -o <output directory>
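The config file itself is a small JSON document. As a sketch of step 2 (the key names and paths below are placeholders, not the toolkit's actual schema; the real keys come from the example config file in the repository), you could write one like this:

```python
import json
import pathlib

# Hypothetical config contents: key names and paths are placeholders only.
# Replace them with the keys from the example config and your local paths.
config = {
    "name": "yeo1000_dil",
    "chunk_idx": "/data/pfc/yeo1000_dil/chunk_idx.nii.gz",
    "fc_chunks": "/data/pfc/yeo1000_dil/fc",
}

config_dir = pathlib.Path.home() / "pfctoolkit_config"
config_dir.mkdir(exist_ok=True)
config_path = config_dir / "yeo1000_dil.json"
config_path.write_text(json.dumps(config, indent=2))
```

The file's stem (yeo1000_dil) is then the config name you pass to `connectome_precomputed -c`.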

Generate a Precomputed Connectome (Instructions WIP)

If, instead of using the provided precomputed connectome, you would like to generate your own, you first need preprocessed BOLD fMRI timecourse data from a set of subjects. Each subject's preprocessed data must contain the same number of timepoints, must be registered to the same MNI152 space, and must be masked with the same brain mask. If you are not using the default MNI152_T1_2mm_brain_mask_dil.nii.gz mask file included with FSL, use pfctoolkit.chunker.generate_chunk_mask to generate a chunk mask file chunk_idx.nii.gz.
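Conceptually, a chunk mask partitions the in-mask voxels into indexed chunks so the connectome can be stored and processed piecewise. A minimal NumPy illustration of that idea (this is not the toolkit's generate_chunk_mask, which operates on NIfTI images; chunk size and labels here are assumptions for demonstration):

```python
import numpy as np

def chunk_labels(brain_mask, chunk_size):
    """Assign consecutive 1-based chunk indices to in-mask voxels.

    Voxels outside the mask keep label 0. Purely illustrative of what a
    chunk mask encodes, not the toolkit's implementation.
    """
    labels = np.zeros(brain_mask.size, dtype=int)
    in_mask = np.flatnonzero(brain_mask)
    labels[in_mask] = np.arange(in_mask.size) // chunk_size + 1
    return labels.reshape(brain_mask.shape)

mask = np.array([True, True, False, True, True, True])
print(chunk_labels(mask, chunk_size=2))  # -> [1 1 0 2 2 3]
```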

Once you have such data, for each subject please create two .npy files.

For a subject with ID SUB001, the first .npy file should be named SUB001.npy and contain the preprocessed BOLD fMRI timecourse data in a numpy array with shape (n_timepoints, n_voxels). The order of the voxels should be determined with nilearn.maskers.NiftiMasker.

The second .npy file should be named SUB001_norms.npy and contain the norms of each voxel's BOLD fMRI timecourse signal in a numpy array with shape (n_voxels,) in the same order as in the first .npy file.

After you have done this for all subjects in your dataset, please place all .npy files in a single "connectome" folder.
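The two files described above can be produced with plain NumPy. A sketch for subject SUB001, using random data in place of real preprocessed BOLD timecourses (shapes follow the description above; toy sizes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_voxels = 200, 1000  # toy sizes, not real data

# Preprocessed BOLD timecourse data, shape (n_timepoints, n_voxels).
timecourse = rng.standard_normal((n_timepoints, n_voxels)).astype(np.float32)

# L2 norm of each voxel's signal across time, shape (n_voxels,),
# in the same voxel order as the timecourse array.
norms = np.linalg.norm(timecourse, axis=0)

np.save("SUB001.npy", timecourse)
np.save("SUB001_norms.npy", norms)
```

In a real pipeline the voxel ordering must come from nilearn.maskers.NiftiMasker applied with your brain mask, so that all subjects share the same ordering.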

Next, to generate functional connectivity chunks for the precomputed connectome, run the following script for each chunk index i:

generate_pfc_fc_chunks -b <name of brain mask> -c <path to chunk_idx.nii.gz> -i <index of chunk to process, i> -cs <path to connectome directory containing .npy files> -o <path to output directory>

Please see additional usage instructions with generate_pfc_fc_chunks -h.
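At its core, each functional connectivity chunk amounts to correlating the timecourses of one chunk's voxels against every voxel, which is where the precomputed norms pay off. A single-subject, pure-NumPy illustration of that correlation step (an assumption about the underlying math, not the toolkit's actual implementation; signals are demeaned so the normalized dot product equals the Pearson correlation):

```python
import numpy as np

rng = np.random.default_rng(1)
n_t, n_vox = 100, 50
tc = rng.standard_normal((n_t, n_vox))
tc -= tc.mean(axis=0)               # demean each voxel's timecourse
norms = np.linalg.norm(tc, axis=0)  # the precomputed per-voxel norms

chunk = np.arange(10)               # voxel indices belonging to this chunk
# Pearson correlation of chunk voxels vs. all voxels in one matrix product
fc_chunk = (tc[:, chunk].T @ tc) / np.outer(norms[chunk], norms)
```

Precomputing the norms once per subject avoids recomputing the denominator for every chunk.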

Next, to generate combo chunks for the precomputed connectome, run the following script for each chunk index i:

generate_pfc_combo_chunks -b <name of brain mask> -c <path to chunk_idx.nii.gz> -i <index of chunk to process, i> -cs <path to connectome directory containing .npy files> -o <path to output directory>

Please see additional usage instructions with generate_pfc_combo_chunks -h.

Lastly, to generate BOLD timecourse norm and standard-deviation weighted masks for the precomputed connectome, run the following script:

generate_pfc_weighted_masks -b <name of brain mask> -cs <path to connectome directory containing .npy files> -n <name of precomputed connectome> -o <path to output directory>

Please see additional usage instructions with generate_pfc_weighted_masks -h.

Development

Source code

You can check out the latest sources with the command:

git clone https://github.com/thewilliamdrew/pfc-toolkit.git

Help and Support

Documentation

Documentation is located here. (WIP)


