Code for the paper Genetically encoded barcodes for correlative volume electron microscopy
This repository contains all code for the paper's sequential ML pipeline.
Pretrained models can be found as release artifacts here. They are automatically downloaded and cached when needed by the code.
Installation
Option 1: From PyPI (with pip)
You can install the project, including its command-line scripts and all dependencies, by running
pip install emcaps
Notes:
- It is recommended to use a fresh virtual environment for this.
- If you encounter PyTorch issues with this setup, please install PyTorch manually following the official instructions.
- If you want to use the napari-based GUI, you will also need to install one of qtpy's supported Qt backends, for example PyQt5:
  pip install pyqt5
  This has to be done manually since no single backend is compatible with all platforms, so pip can't auto-select one for you.
Option 2: From sources (with pip or conda)
First obtain the project sources (either clone the repository or download and extract a zip archive) and cd to the project root.
If you want to install all dependencies and the package itself with pip, just run
pip install .
Alternatively, if you want to install the dependencies with conda, run the following commands:
conda env create -f environment.yml
conda activate emcaps
pip install .
Running the code
All scripts can be executed from the project root directory using python3 -m, for example:
$ python3 -m emcaps.inference.segment -h
Alternatively you can use the entry points provided by the pip installation:
$ emcaps-segment -h
Entry points for testing on custom data
These entry points only require raw images; labels and GPU resources are not needed.
Napari-based interactive GUI tool for segmentation and EMcapsulin particle classification
$ emcaps-encari
or
$ python3 -m emcaps.analysis.encari
Performing batch inference on a directory of images or single image files
$ emcaps-segment segment.inp_path=<PATH_TO_FILE_OR_FOLDER>
or
$ python3 -m emcaps.inference.segment segment.inp_path=<PATH_TO_FILE_OR_FOLDER>
Entry points for reproduction, retraining or evaluation
The following steps require a local copy of the official dataset or a dataset in the same structure. A GPU is highly recommended.
Splitting labeled image dataset into training and validation images and normalizing the data format
$ emcaps-splitdataset
or
$ python3 -m emcaps.utils.splitdataset
Training new segmentation models
$ emcaps-segtrain
or
$ python3 -m emcaps.training.segtrain
Segmentation inference and evaluation
Segments images, optionally performs particle-level classification if a model is available, renders output visualizations (colored classification overlays etc.), and computes segmentation metrics.
$ emcaps-segment
or
$ python3 -m emcaps.inference.segment
For a usage example featuring config sweeps, see _scripts/seg_cls_test.sh
Producing a patch dataset based on image segmentation
Based on a segmentation (from a model or human annotation), extracts particle-centered image patches and stores them as separate files alongside metadata. The resulting patch dataset can be used for training patch-based classification models. In addition, a random sample of the validation patches is prepared for evaluating both human and model-based classification.
$ emcaps-patchifyseg
or
$ python3 -m emcaps.inference.patchifyseg
Training new patch classifiers
Requires the outputs of patchifyseg (see above).
$ emcaps-patchtrain
or
$ python3 -m emcaps.training.patchtrain
Quantitative evaluation of patch classification results
Requires the outputs of patchifyseg (see above).
$ emcaps-patcheval
or
$ python3 -m emcaps.inference.patcheval
For a usage example featuring config sweeps, see _scripts/patcheval.sh
Rendering average images of patch collections and grouping patches by EMcapsulin types
Requires the outputs of patchifyseg (see above).
$ emcaps-averagepatches
or
$ python3 -m emcaps.analysis.averagepatches
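Taken together, the retraining entry points above form a linear pipeline in which each step consumes the outputs of the previous one. The following is only a sketch of the command order (all commands are the entry points listed above; flags and config overrides are omitted, and the dataset is assumed to be extracted to ~/emc/emcapsulin as described in the Dataset section):

```shell
# Sketch of a full reproduction run, in suggested order.
set -e
emcaps-splitdataset     # split labeled images into training/validation
emcaps-segtrain         # train a segmentation model
emcaps-patchifyseg      # extract particle-centered patches
emcaps-patchtrain       # train a patch classifier
emcaps-patcheval        # evaluate patch classification
emcaps-averagepatches   # optional: render average patch images
```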
Configuration system
We use a common configuration system for all runnable code, based on Hydra and OmegaConf.
A central default config file with explanatory comments is located at conf/conf.yaml.
It is written to be as automatic and minimal as possible, but it can still be necessary to change some of the values for experiments or for adapting to a different system.
For the syntax of such yaml-based config files, please refer to the OmegaConf docs on access and manipulation and on variable interpolation.
For running Hydra-enabled code with a custom configuration, you can either point to a different config directory with the -cp CLI flag or change config values directly on the CLI using Hydra's override syntax.
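Conceptually, an override string like segment.inp_path=/data/raw_images just assigns a value to a dotted key path in the nested config. The following is a minimal illustrative sketch of that idea, not Hydra's actual implementation (real Hydra also handles type parsing, list indices, sweeps, etc.); the path /data/raw_images is a made-up example:

```python
def apply_override(cfg: dict, override: str) -> dict:
    """Apply a single Hydra-style 'a.b.c=value' override to a nested dict.

    Conceptual sketch only; values are kept as plain strings here.
    """
    key_path, _, value = override.partition("=")
    node = cfg
    *parents, leaf = key_path.split(".")
    for key in parents:
        # Descend into the nested config, creating sections as needed.
        node = node.setdefault(key, {})
    node[leaf] = value
    return cfg


cfg = {"segment": {"inp_path": None}}
apply_override(cfg, "segment.inp_path=/data/raw_images")
print(cfg["segment"]["inp_path"])  # -> /data/raw_images
```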
Dataset
If you want to train your own models and/or do quantitative evaluation on the official data, please find the data here and extract it to ~/emc/emcapsulin.
Further notes
For more details see top-level docstrings in each file.