A pipeline and utils for IMC data analysis.
Imaging mass cytometry pipeline
This is a pipeline for the processing of imaging mass cytometry (IMC) data.
It is largely based on Vito Zanotelli's pipeline.
It performs image preprocessing and filtering, uses ilastik for semi-supervised pixel classification, and uses CellProfiler for image segmentation and quantification of single cells.
The pipeline can be used in standalone mode or with imcrunner in order to process multiple samples in parallel and in a distributed way, whether on a local computer, on the cloud, or on a high-performance computing (HPC) cluster.
This is made possible by the lightweight computing configuration manager divvy.
Requirements and installation
Requires:
- Python >= 3.7
- One of: docker, singularity, conda, or a local cellprofiler installation.
Install with:
pip install imcpipeline
Make sure to have an up-to-date version of pip. Development and testing are only done on Linux. If anyone is interested in maintaining this repository on macOS/Windows, feel free to submit a PR.
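For example, to update pip before installing (a standard pip invocation, not specific to this project):
python -m pip install --upgrade pip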
Quick start
Demo
You can run a demo dataset using the --demo flag:
imcpipeline --demo
The pipeline will try to use a local cellprofiler installation, docker, or singularity, in that order, depending on which is available.
Output files are placed in an imcpipeline_demo_data directory.
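To take a quick look at the generated files (assuming a standard Unix shell):
ls imcpipeline_demo_data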
Running on your data
To run the pipeline on real data, simply specify input and output directories. A trained ilastik model can be provided; if not, the user will be prompted to train one.
imcpipeline \
--container docker \
--ilastik-model model.ilp \
-i input_dir -o output_dir
If neither docker nor singularity is available, one could instead use, for example, a conda or virtualenv environment that is activated only for the cellprofiler command, like this:
imcpipeline \
--cellprofiler-exec \
"source ~/.miniconda2/bin/activate && conda activate cellprofiler && cellprofiler"
--ilastik-model model.ilp \
-i input_dir -o output_dir
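As a sketch of the same idea with a virtualenv instead of conda (the ~/venvs/cellprofiler path is hypothetical; adjust it to wherever your environment lives):
imcpipeline \
--cellprofiler-exec \
"source ~/venvs/cellprofiler/bin/activate && cellprofiler" \
--ilastik-model model.ilp \
-i input_dir -o output_dir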
To run one step only for a single sample, use the -s/--step argument:
imcpipeline \
--step segmentation \
-i input_dir -o output_dir
Or provide more than one consecutive step in the same way, separated by commas:
imcpipeline \
--step predict,segmentation \
-i input_dir -o output_dir
To run the pipeline for various samples in a specific computing configuration (more details in the documentation):
imcrunner \
--divvy-configuration slurm \
metadata.csv \
--container docker \
--ilastik-model model.ilp \
-i input_dir -o output_dir
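The columns expected in metadata.csv are described in the documentation; purely as an illustration, a per-sample annotation table of this kind could look something like the sketch below (the column names and paths are hypothetical):
# hypothetical layout -- consult the documentation for the actual schema
sample_name,input_dir
sample01,data/sample01
sample02,data/sample02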
Documentation
For additional details on the pipeline, see the documentation.
Related software
- Vito Zanotelli's pipeline;
- A similar pipeline implemented in Nextflow.