
A python library for converting PET imaging and blood data to BIDS.

Project description

PET2BIDS is a code library to convert source Brain PET data to BIDS


This repository hosts tools to curate brain PET data using the Brain Imaging Data Structure (BIDS) specification. The work to create these tools is funded by the Novo Nordisk Foundation (NNF20OC0063277) and the BRAIN Initiative (MH002977-01).

For DICOM image conversion, we rely on dcm2niix, in collaboration with Prof. Chris Rorden, without whom we could not convert your data! For more information on dcm2niix and NIfTI, please see the paper The first step for neuroimaging data analysis: DICOM to NIfTI conversion.

Documentation

For more detailed (and most likely more helpful) documentation, visit this project's Read the Docs site at:

https://pet2bids.readthedocs.io

Installation

Simply download the repository and follow the Matlab- or Python-specific instructions below. The Matlab and Python code provide the same functionality.

matlab


  1. Remember to add the PET2BIDS/matlab folder to your path; this is where you will find the source code to use.
  2. If converting DICOM files, make sure you have dcm2niix installed (Windows users: edit dcm2niix4pet.m to set the correct path to the .exe).
  3. Start using the code! More info here.

pypet2bids

Use pip to install this library directly from PyPI:

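The distribution is published on PyPI under the name pypet2bids, so the command is simply:

pip install pypet2bids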

If you wish to install directly from this repository, see the instructions below on how to either build a packaged version of pypet2bids or run the code from source.

Build Package Locally and Install with PIP

We use poetry to build this package; no other build methods are supported. Further, we encourage the use of GNU make and a bash-like shell to simplify the build process.

After installing poetry, you can build and install this package into your local version of Python with the following commands (keep in mind the commands below are executed in a bash-like shell):

cd PET2BIDS
cp -R metadata/ pypet2bids/pypet2bids/metadata
cp pypet2bids/pyproject.toml pypet2bids/pypet2bids/pyproject.toml
cd pypet2bids && poetry lock && poetry build
pip install dist/pypet2bids-X.X.X-py3-none-any.whl

Why is all the above required? Well, because this is a monorepo and we just have to work around that sometimes.

[!NOTE] Make and the additional scripts contained in the scripts/ directory are for the convenience of non-Windows users.

If you have GNU make installed and are using bash or something bash-like in your terminal of choice, run the following:

cd PET2BIDS
make installpoetry buildpackage installpackage

Run Directly From Source

Lastly, if you wish to run pypet2bids directly from the source code in this repository, or to help contribute to the Python portion of this project or to any of the documentation, you can do so via the following options:

cd PET2BIDS/pypet2bids
poetry install

Or you can install the dependencies using only pip:

cd PET2BIDS/pypet2bids
pip install .

After installing the dependencies with either poetry or pip, the modules can be executed as follows:

cd PET2BIDS/pypet2bids
python dcm2niix4pet.py --help
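
A typical conversion call then points the script at a folder of DICOMs and at the output PET folder of your BIDS dataset. The paths below are placeholders, and the exact option names should be confirmed against the --help output:

python dcm2niix4pet.py /path/to/dicom/folder --destination-path /path/to/bids/sub-01/pet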

Note: We recommend using dcm2niix v1.0.20220720 or newer; we rely on metadata included in these later releases. It's best to collect releases from the rordenlab/dcm2niix releases page. We have observed that package managers such as yum, apt, or apt-get often install much older versions of dcm2niix, e.g. v1.0.2017XXXX or v1.0.2020XXXXX. You may run into invalid BIDS output or errors in this software with those older versions.
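
To confirm which dcm2niix release is on your PATH before converting, recent releases can print their version (if your build does not accept the flag, that in itself is a sign it is too old):

dcm2niix --version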

spreadsheet_conversion (custom and pmod)

This folder contains spreadsheet templates and examples of metadata, along with Matlab and Python code to convert them to JSON files. Often, metadata such as frame durations, InjectedRadioactivity, etc. are stored in spreadsheets, and we have made these functions to create the JSON files automatically for one or many subjects at once to go with the NIfTI imaging data. Note that we also have conversion for PMOD files (also spreadsheets), allowing export to blood.tsv files.

metadata

A small collection of JSON files for our metadata information.

user metadata

No matter which way you prefer to input metadata (passing all arguments, using a txt or env file, using spreadsheets), you are always right! DICOM values will be ignored - BUT they are checked, and the code tells you if there is an inconsistency between your inputs and what the DICOM says.
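
As an illustration of the "passing all arguments" route, key/value pairs can be supplied on the command line and will take precedence over the corresponding DICOM fields. The values below are made up, and the exact option names should be checked against dcm2niix4pet --help:

python dcm2niix4pet.py /path/to/dicom/folder --destination-path /path/to/bids/sub-01/pet --kwargs InjectedRadioactivity=81.24 InjectionStart=0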

ecat_validation

This folder contains code for generating Siemens HRRT scanner data in the ECAT file format and for validating the Matlab and Python conversion tools (i.e., given the data generated as ECAT, do our NIfTI images accurately reflect those data?).

Citation

Please cite us when using PET2BIDS.

Contribute

Anyone is welcome to contribute! Check here how you can get involved and see the code of conduct. Contributors are listed here.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pypet2bids-1.4.3.tar.gz (197.0 kB)

Uploaded Source

Built Distribution

pypet2bids-1.4.3-py3-none-any.whl (205.6 kB)

Uploaded Python 3

File details

Details for the file pypet2bids-1.4.3.tar.gz.

File metadata

  • Download URL: pypet2bids-1.4.3.tar.gz
  • Upload date:
  • Size: 197.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.0 CPython/3.11.0 Darwin/24.5.0

File hashes

Hashes for pypet2bids-1.4.3.tar.gz

  • SHA256: 31865c964e85e9ac9276c488b8982a1e9fdefb1757e900cf636a54dca2923980
  • MD5: 21c8a2d1706b0a3d158703c6109986bb
  • BLAKE2b-256: 495fb1b7a5a85bae9a6a48e240ad2abfd5a406def64789a9a65c79591045b814

See more details on using hashes here.

File details

Details for the file pypet2bids-1.4.3-py3-none-any.whl.

File metadata

  • Download URL: pypet2bids-1.4.3-py3-none-any.whl
  • Upload date:
  • Size: 205.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.0 CPython/3.11.0 Darwin/24.5.0

File hashes

Hashes for pypet2bids-1.4.3-py3-none-any.whl

  • SHA256: 19068bf600544ad230457d1b1e2fb220a0bb7b9c310bd858cc7eb64b1f1bd7d0
  • MD5: 2f4a1e1111207feece2d948b5c4a3c3c
  • BLAKE2b-256: 0eb50437e3beed13ae1f70a017a09be6cb74a8f18109e43efd0e1c31bf024c49

See more details on using hashes here.
