Project description
cellular-Automated Annotation Pipeline (cell-AAP)
Utilities for the semi-automated generation of instance segmentation annotations to be used for neural network training. The utilities are built on top of UMAP, HDBSCAN, and a fine-tuned encoder version of FAIR's Segment Anything Model, developed by Computational Cell Analytics for the micro-sam project. In addition to providing utilities for annotation building, we train a network, FAIR's detectron2, to:
- Demonstrate the efficacy of our utilities.
- Be used for microscopy annotation of supported cell lines.
Supported cell lines currently include:
- HeLa
In development cell lines currently include:
- U2OS
- HT1080
- Yeast
We've developed a napari application for using this pre-trained network and propose a transfer-learning schematic for handling new cell lines.
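The utilities are built on UMAP and HDBSCAN, which are typically chained to reduce and then cluster high-dimensional embeddings. The snippet below is a minimal, hypothetical sketch of that pattern, not the cell-AAP API: it assumes you already have an (N, D) array of per-cell features (e.g. from a fine-tuned SAM image encoder).

```python
# Minimal, hypothetical sketch of the UMAP + HDBSCAN clustering pattern the
# utilities build on; this is NOT the cell-AAP API. `embeddings` stands in for
# per-cell features (e.g. from a fine-tuned SAM image encoder).
import numpy as np
import umap
import hdbscan

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(500, 256))  # placeholder for real encoder features

# Reduce dimensionality, then run density-based clustering; label -1 means noise.
reduced = umap.UMAP(n_components=2, random_state=0).fit_transform(embeddings)
labels = hdbscan.HDBSCAN(min_cluster_size=10).fit_predict(reduced)
print("clusters found:", len(set(labels)) - (1 if -1 in labels else 0))
```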
Installation
We highly recommend installing cell-AAP in a clean conda environment. To do so, you must have Miniconda or Anaconda installed.
If a conda distribution has been installed:
- Create and activate a clean environment

  ```bash
  conda create -n cell-aap-env
  conda activate cell-aap-env
  ```
- Within this environment, install pip

  ```bash
  conda install pip
  ```
- Then install cell-AAP from PyPI

  ```bash
  pip install cell-AAP --upgrade
  ```
- Finally, detectron2 must be built from source, atop cell-AAP

  ```bash
  # For MacOS
  CC=clang CXX=clang++ ARCHFLAGS="-arch arm64" python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'

  # For other operating systems
  python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'
  ```
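To verify the installation, you can try importing the packages from Python. This is an optional sanity check; the module name `cell_AAP` is assumed here from the wheel filename and may differ in your install.

```bash
# Optional sanity check; the import name cell_AAP is assumed from the wheel filename
python -c "import cell_AAP, detectron2; print('imports OK')"
```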
Napari Plugin Usage
- To open napari, simply type "napari" into the command line; ensure that you are working in the correct environment
- To instantiate the plugin, navigate to the "Plugins" menu and select "cell-AAP"
- You should now see the plugin, where you can select an image, display it, and run inference on it.
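The plugin can also be opened from a Python session rather than the GUI menu. The snippet below is a minimal sketch, assuming the plugin is registered under the name "cell-AAP" shown in the Plugins menu.

```python
# Minimal sketch: open napari and dock the cell-AAP plugin programmatically.
# The plugin name "cell-AAP" is assumed from the Plugins menu entry above.
import napari

viewer = napari.Viewer()
viewer.window.add_plugin_dock_widget("cell-AAP")
napari.run()  # start the event loop
```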
Configs Best Practices
If running inference on large volumes of data, i.e. time-series data >= 300 MB in size, we recommend proceeding in the following manner.
- Assemble a small (< 100 MB) substack of your data using Python or a program like ImageJ (see the sketch after this list)
- Use this substack to find the optimal parameters for your data (Number of Cells, Confidence)
- Run inference over the full volume using the discovered optimal parameters
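For the substack step, the sketch below shows one way to do it in Python. It assumes the time series is stored as a single multi-page TIFF and uses the tifffile package; the filenames and frame range are placeholders.

```python
# Minimal sketch: extract a small substack from a large time-series TIFF for
# parameter tuning. Filenames and the frame range are placeholders.
import tifffile

stack = tifffile.imread("timeseries_full.tif")    # e.g. shape (T, Y, X)
substack = stack[:20]                             # keep the first 20 frames
tifffile.imwrite("timeseries_substack.tif", substack)
print(substack.shape, round(substack.nbytes / 1e6, 1), "MB")
```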
Note: Finding the optimal set of parameters requires some trial and error; to assist, we've created the table below.
| Classifications $\Downarrow$ Detections $\Rightarrow$ | Too few | Too many |
|---|---|---|
| Dropping M-phase | Confidence $\Downarrow$ Number of Cells $\Uparrow$ | Confidence $\Downarrow$ Number of Cells $\Downarrow$ |
| Misclassifying M-phase | Confidence $\Uparrow$ Number of Cells $\Uparrow$ | Confidence $\Uparrow$ Number of Cells $\Downarrow$ |
Interpreting Results
Once inference is complete, the following colors indicate the class prediction:
- Red: Non-mitotic
- Blue: Mitotic
- Purple: Interclass double prediction
Note: Interclass double predictions are often early prophase cells that the network is not "confident" in; to mitigate such predictions, increase the minimum confidence threshold. This will typically cause most double predictions to regress to the Non-mitotic class.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file cell_aap-0.0.7.tar.gz.
File metadata
- Download URL: cell_aap-0.0.7.tar.gz
- Upload date:
- Size: 17.3 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.10.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | b9313aa301b7ef2d29909a4f50eac5d0a1a16e9a00a28410f79dec58d9aec602 |
| MD5 | d3397d882abe87b0fd0f67c7d0b40826 |
| BLAKE2b-256 | a478103c4e50de1598a42300e2e5ba5dbb49ce442a0e0afc32de9d070c723f48 |
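If you download the sdist manually, you can check it against the SHA256 digest above. A minimal sketch, assuming the file is in the current directory:

```python
# Verify a downloaded sdist against the SHA256 digest listed above.
import hashlib

expected = "b9313aa301b7ef2d29909a4f50eac5d0a1a16e9a00a28410f79dec58d9aec602"
with open("cell_aap-0.0.7.tar.gz", "rb") as f:  # assumed local filename
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "hash mismatch")
```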
File details
Details for the file cell_AAP-0.0.7-py3-none-any.whl.
File metadata
- Download URL: cell_AAP-0.0.7-py3-none-any.whl
- Upload date:
- Size: 25.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.10.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e422f454182d9ad35402302db3912c52124af519dd4acfe8c855422d5b380fd8 |
| MD5 | f30537ad5ffc2a3db433628b68669332 |
| BLAKE2b-256 | fc0f1d5322aa799d9f3d9e8149cdd5f938fffc1c7b84b50eb5cb4f1960839f10 |