
A deep-learning-oriented microscopy image simulation package

Project description

DeepTrack is a comprehensive deep learning framework for digital microscopy. We provide tools to create physical simulations of customizable optical systems, to generate and train neural network models, and to analyze experimental data.

Note!

This branch is a developmental branch preparing for the next major release. While the branch should work at any point in time, many features are still in development and subject to change.

Roadmap

New Features!

  • Full GPU support.
    • Infer cupy support during import.
    • Allow separate GPU/CPU methods for optimization.
    • Integrated benchmarking.
    • Allow forcefully overriding CPU/GPU state.
    • Disable/enable GPU using environment variables.
  • Simulation-parameter optimization.
    • Grid search.
    • Genetic optimization.
    • Assumption of independent variables.
  • Architecture searches.
  • Common architectures with pre-trained weights.
  • Label-free training of particle-tracking models.
  • Particle tracing / linking.
  • Export to DeepImageJ.

Usage improvements

  • Move from .resolve() to __call__ as the primary evaluation method (see the sketch after this list).
    • Features can be passed as properties, and will be resolved with no input.
    • Implement a way to bypass property evaluation (better than wrapping in a lambda).
  • Facilitate the construction of compound shapes.
    • Separate Scatterers into Geometry and Scatterers.
    • Compound shapes using dt.Scatter.
    • Implement __sub__, which sets everything within it to 0.
    • All optics wrap the sample as a dt.Scatter, which produces a volume.
  • Allow for modular creation of optics pipeline.
    • Separately define the pupil, the input illumination, the simulation method and the sample.
    • Test simulations against theory to ensure that they work as expected.
  • Deprecate * shorthand for probability and instead use it as shorthand for multiplication.
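
As a rough illustration of the planned change from .resolve() to __call__, the sketch below builds a small pipeline and evaluates it both ways. The feature names (dt.PointParticle, dt.Fluorescence) follow the current DeepTrack 2.0 API; the __call__ line reflects the roadmap item above and may not yet work on this branch.

```python
import numpy as np
import deeptrack as dt

# A single point emitter imaged through a simulated fluorescence microscope.
particle = dt.PointParticle(
    intensity=100,
    position=lambda: np.random.uniform(0, 64, size=2),
    position_unit="pixel",
)
optics = dt.Fluorescence(output_region=(0, 0, 64, 64))
pipeline = optics(particle)

image = pipeline.resolve()  # current primary evaluation method
# image = pipeline()        # planned: __call__ as the primary evaluation method
```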

Misc

  • Implement rigorous and transparent error handling.
  • Expand and standardize unittests.
  • Better utilize continuous integration.
  • Expand documentation with examples of each feature.

Getting started

Installation

DeepTrack 2.0 requires at least Python 3.6.

To install DeepTrack 2.0, open a terminal or command prompt and run

pip install deeptrack
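
To verify that the installation worked, you can import the package in Python. The snippet below is a minimal check; it assumes the package exposes a __version__ attribute.

```python
# Minimal sanity check after installation.
import deeptrack as dt

print(dt.__version__)  # assumes the package exposes a __version__ attribute
```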

Learning DeepTrack 2.0

Everybody learns in different ways! Depending on your preferences, and what you want to do with DeepTrack, you may want to check out one or more of these resources.

Fundamentals

First, we have a very general walkthrough of basic and advanced topics. This is a 5-10 minute read that will get you well on your way to understanding the unique interactions available in DeepTrack.

DeepTrack 2.0 in action

To see DeepTrack in action, we provide six well-documented tutorial notebooks that create simulation pipelines and train models (a minimal pipeline sketch follows the list):

  1. deeptrack_introduction_tutorial gives an overview of how to use DeepTrack 2.0.
  2. tracking_particle_cnn_tutorial demonstrates how to track a point particle with a convolutional neural network (CNN).
  3. tracking_multiple_particles_unet_tutorial demonstrates how to track multiple particles using a U-net.
  4. characterizing_aberrations_tutorial demonstrates how to add and characterize aberrations of an optical device.
  5. distinguishing_particles_in_brightfield_tutorial demonstrates how to use a U-net to track and distinguish particles of different sizes in brightfield microscopy.
  6. analyzing_video_tutorial demonstrates how to create videos and how to train a neural network to analyze them.
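
As a taste of what the tracking tutorials do, the sketch below shows how the ground-truth position used to simulate an image can be read back from the resolved image and used as a training label. The names (dt.PointParticle, dt.Fluorescence, dt.Gaussian, get_property) and the + composition follow the DeepTrack 2.0 API described in the tutorials and may differ on this developmental branch; treat it as a sketch rather than a drop-in replacement for the notebooks.

```python
import numpy as np
import deeptrack as dt

# A fluorescent point particle with a randomized position, plus Gaussian noise,
# in the spirit of tracking_particle_cnn_tutorial.
particle = dt.PointParticle(
    intensity=100,
    position=lambda: np.random.uniform(8, 56, size=2),
    position_unit="pixel",
)
optics = dt.Fluorescence(output_region=(0, 0, 64, 64))
pipeline = optics(particle) + dt.Gaussian(sigma=0.05)

image = pipeline.resolve()

# The resolved image carries the properties used to generate it, so the
# ground-truth position can be read back and used as a training label.
label = np.array(image.get_property("position"))
```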

Additionally, we provide seven more case studies which are less documented but give additional insight into how to use DeepTrack with real datasets:

  1. MNIST classifies handwritten digits.
  2. single particle tracking tracks experimentally captured videos of a single particle. (Requires opencv-python compiled with ffmpeg to open and read a video.)
  3. single particle sizing extracts the radius and refractive index of particles.
  4. multi-particle tracking detects quantum dots in a low SNR image.
  5. 3-dimensional tracking tracks particles in three dimensions.
  6. cell counting counts the number of cells in fluorescence images.
  7. GAN image generation uses a GAN to create cell images from masks.

Video Tutorials

  • DeepTrack 2.0 introduction tutorial video: https://youtu.be/hyfaxF8q6VE
  • DeepTrack 2.0 recognizing handwritten digits tutorial video: https://youtu.be/QD9JUXyLJpc
  • DeepTrack 2.0 single particle tracking tutorial video: https://youtu.be/6Cntik6AfBI
  • DeepTrack 2.0 single-particle characterization tutorial video: https://youtu.be/ia2H1QO1cHg
  • DeepTrack 2.0 multiple particle tracking tutorial video: https://youtu.be/wFV2VqzpeZs
  • DeepTrack 2.0 multiple particle tracking in 3D tutorial video: https://youtu.be/fzD1QIEIJ04
  • DeepTrack 2.0 cell counting tutorial video: https://youtu.be/C6hu_IYoWtI
  • DeepTrack 2.0 GAN image generation tutorial video: https://youtu.be/8g44Yks7cis

In-depth dives

The examples folder contains notebooks that explain the different modules in more detail. These can be read in any order, but we provide a recommended order where more fundamental topics are introduced early; a short sketch of the property-sampling idiom from the first notebooks follows the list. This order is as follows:

  1. features_example
  2. properties_example
  3. scatterers_example
  4. optics_example
  5. aberrations_example
  6. noises_example
  7. augmentations_example
  8. image_example
  9. generators_example
  10. models_example
  11. losses_example
  12. utils_example
  13. sequences_example
  14. math_example
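
The first two notebooks (features_example and properties_example) revolve around how randomized properties are re-sampled between evaluations. A minimal sketch of that idiom, assuming the 2.0-style update()/resolve() API and the feature names used above:

```python
import numpy as np
import deeptrack as dt

# Properties can be callables; update() re-samples them, so each resolve()
# can produce a new random realization of the same pipeline.
particle = dt.PointParticle(
    intensity=100,
    position=lambda: np.random.uniform(0, 64, size=2),
    position_unit="pixel",
)
pipeline = dt.Fluorescence(output_region=(0, 0, 64, 64))(particle)

pipeline.update()             # re-sample all randomized properties
frame_1 = pipeline.resolve()

pipeline.update()             # re-sample again
frame_2 = pipeline.resolve()  # the particle is now at a different position
```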

Graphical user interface

DeepTrack 2.0 provides a completely stand-alone graphical user interface, which delivers all the power of DeepTrack without requiring programming knowledge.


Documentation

The detailed documentation of DeepTrack 2.0 is available at the following link: https://softmatterlab.github.io/DeepTrack-2.0/deeptrack.html

Cite us!

If you use DeepTrack 2.0 in your project, please cite us here:

Benjamin Midtvedt, Saga Helgadottir, Aykut Argun, Jesús Pineda, Daniel Midtvedt, Giovanni Volpe. "Quantitative Digital Microscopy with Deep Learning." [arXiv:2010.08260](https://arxiv.org/abs/2010.08260)

See also:

Saga Helgadottir, Aykut Argun, and Giovanni Volpe. "Digital video microscopy enhanced by deep learning." Optica 6.4 (2019): 506-513. [10.1364/OPTICA.6.000506](https://doi.org/10.1364/OPTICA.6.000506)

Saga Helgadottir, Aykut Argun, and Giovanni Volpe. "DeepTrack." https://github.com/softmatterlab/DeepTrack.git (2019).

Funding

This work was supported by the ERC Starting Grant ComplexSwimmers (Grant No. 677511).
