pyapetnet
A CNN for anatomy-guided deconvolution and denoising of PET images
A convolutional neural network (CNN) to mimic the behavior of anatomy-guided PET reconstruction in image space.
Authors
Georg Schramm, David Rigie
License
This project is licensed under the MIT License - see the LICENSE file for details
Scientific Publication
Details about pyapetnet are published in Schramm et al., "Approximating anatomically-guided PET reconstruction in image space using a convolutional neural network", NeuroImage, vol. 224, 2021. If you use pyapetnet in scientific publications, we appreciate a citation of this article.
Installation
We recommend using the anaconda python distribution and creating a conda virtual environment for pyapetnet.
The installation consists of three steps:
- Installation of anaconda (miniconda) python distribution
- Creation of the conda virtual environment with all dependencies
- Installation of the pyapetnet package using pip
Installation of anaconda (miniconda)
Download and install Miniconda from https://docs.conda.io/en/latest/miniconda.html.
Please use the Python 3.x installer and confirm that the installer
should run conda init
at the end of the installation process.
To test your miniconda installation, open a new terminal and execute
conda list
which should list the installed basic python packages.
Creation of the virtual conda environment
To create a virtual conda python=3.8 environment execute
conda create -n pyapetnet python=3.8 ipython
You can also use a newer version of python, if supported by tensorflow. To test the installation of the virtual environment, execute
conda activate pyapetnet
Installation of the pyapetnet package
Activate the virtual conda environment
conda activate pyapetnet
The easiest way is to install pyapetnet directly from the python package index (PyPI) via
pip install pyapetnet
which will install the pyapetnet package inside the virtual conda environment.
To test the installation run (inside python or ipython)
import pyapetnet
print(pyapetnet.__version__)
print(pyapetnet.__file__)
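As an alternative to the interactive check above, a small helper can report whether the package is importable without raising an error if it is missing. This is just an illustrative convenience snippet (the function name check_install is not part of pyapetnet):

```python
import importlib
import importlib.util


def check_install(pkg: str = "pyapetnet"):
    # Return the installed version string of a package,
    # "unknown" if the package has no __version__ attribute,
    # or None if the package is not installed at all.
    spec = importlib.util.find_spec(pkg)
    if spec is None:
        return None
    mod = importlib.import_module(pkg)
    return getattr(mod, "__version__", "unknown")


print(check_install() or "pyapetnet is not installed in this environment")
```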
If the installation was successful, a number of command line scripts, all starting with pyapetnet*, will be available, e.g. for running predictions with the included trained models on nifti and dicom input images.
Getting started - running your first prediction with pre-trained models
To run a prediction using one of the included pre-trained networks and nifti images, run e.g.:
pyapetnet_predict_from_nifti osem.nii t1.nii S2_osem_b10_fdg_pe2i --show
Use the following to get information on the (optional) input arguments
pyapetnet_predict_from_nifti -h
To get a list of available pre-trained models run
pyapetnet_list_models
To make predictions from dicom images, use
pyapetnet_predict_from_dicom osem_dcm_dir t1_dcm_dir S2_osem_b10_fdg_pe2i --show
The source code of the prediction scripts can be found in the command_line_tools submodule.
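If you prefer to drive the command line tools from a Python script, one option is to assemble the argument list and hand it to the stdlib subprocess module. A minimal sketch, assuming pyapetnet is installed on your PATH (the helper name build_predict_cmd is purely illustrative):

```python
import subprocess


def build_predict_cmd(pet_nifti, mri_nifti, model_name, show=False):
    # Assemble the argument list for the nifti prediction CLI:
    # pyapetnet_predict_from_nifti <pet> <mri> <model> [--show]
    cmd = ["pyapetnet_predict_from_nifti", pet_nifti, mri_nifti, model_name]
    if show:
        cmd.append("--show")
    return cmd


cmd = build_predict_cmd("osem.nii", "t1.nii", "S2_osem_b10_fdg_pe2i", show=True)
print(" ".join(cmd))

# To actually run the prediction (requires pyapetnet to be installed):
# subprocess.run(cmd, check=True)
```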