A SAM-based model for segmenting grains in images

segmenteverygrain

Description

'segmenteverygrain' is a Python package that aims to detect grains (or grain-like objects) in images. The goal is to develop an ML model that does a reasonably good job of detecting most of the grains in a photo, so that it can be used for determining grain size and grain shape, a common task in geomorphology and sedimentary geology. 'segmenteverygrain' relies on the Segment Anything Model (SAM), developed by Meta, to obtain high-quality outlines of the grains. However, SAM requires prompts for every object to be detected and, when used in 'everything' mode, it tends to be slow and produces many overlapping masks and non-grain (background) objects. To deal with these issues, 'segmenteverygrain' uses a Unet-style, patch-based convolutional neural network to create a first-pass segmentation, which is then used as a set of prompts for the SAM-based segmentation. Some grains will be missed with this approach, but the segmentations that are created tend to be of high quality.
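
The two-stage idea can be illustrated roughly as follows. This is a minimal sketch, not the package's own API: the model file name, the preprocessing, and the prompt-extraction logic are assumptions for illustration, and the real Unet works on image patches rather than whole images.

```python
import numpy as np
from skimage import io, measure
from tensorflow.keras.models import load_model
from segment_anything import sam_model_registry, SamPredictor

# --- First pass: Unet-style semantic segmentation (sketch) ---
# 'unet_model.keras', the scaling, and the output channel layout are
# placeholders; the actual model tiles large images into patches.
image = io.imread('grains.jpg')                      # HxWx3 RGB image
unet = load_model('unet_model.keras', compile=False)
pred = unet.predict(image[np.newaxis, ...] / 255.0)[0]
grain_prob = pred[..., 1]                            # assumed 'grain' channel
labels = measure.label(grain_prob > 0.5)             # first-pass grain blobs

# --- Second pass: use blob centroids as point prompts for SAM ---
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)
predictor.set_image(image)

grain_masks = []
for region in measure.regionprops(labels):
    y, x = region.centroid
    masks, scores, _ = predictor.predict(
        point_coords=np.array([[x, y]]),             # SAM expects (x, y) order
        point_labels=np.array([1]),                  # 1 = foreground prompt
        multimask_output=False,
    )
    grain_masks.append(masks[0])                     # boolean HxW mask per grain
```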

'segmenteverygrain' also includes a set of functions that make it possible to clean up the segmentation results: deleting and merging objects by clicking on them, and adding grains that were not segmented automatically. The QC-d masks can be saved and added to a dataset of grain images (see the 'images' folder). These images can then be used to improve the Unet model. Many of the images in the dataset come from the sedinet project.
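
The interactive editing relies on notebook figure callbacks. The snippet below only illustrates the general idea (removing a grain by clicking on it) with plain matplotlib; it is not the package's own implementation, and 'labels' is assumed to be a labeled grain image with 0 as background.

```python
import numpy as np
import matplotlib.pyplot as plt

def interactive_delete(image, labels):
    """Illustrative sketch: click on a grain in the figure to delete it."""
    labels = labels.copy()
    fig, ax = plt.subplots()
    ax.imshow(image)
    overlay = ax.imshow(np.ma.masked_where(labels == 0, labels),
                        cmap='tab20', alpha=0.5)

    def on_click(event):
        if event.inaxes is not ax or event.xdata is None:
            return
        row = int(np.clip(round(event.ydata), 0, labels.shape[0] - 1))
        col = int(np.clip(round(event.xdata), 0, labels.shape[1] - 1))
        grain_id = labels[row, col]
        if grain_id > 0:                       # clicked on a grain: remove it
            labels[labels == grain_id] = 0
            overlay.set_data(np.ma.masked_where(labels == 0, labels))
            fig.canvas.draw_idle()

    fig.canvas.mpl_connect('button_press_event', on_click)
    return labels
```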

This is a work in progress.

Requirements

  • numpy
  • matplotlib
  • scipy
  • pandas
  • pillow
  • scikit-image
  • opencv-python
  • networkx
  • rasterio
  • shapely
  • tensorflow
  • pytorch
  • segment-anything
  • tqdm

Installation

pip install segmenteverygrain

Getting started

See the Segment_every_grain.ipynb notebook for an example of how the models can be loaded and used for segmenting an image and QC-ing the result.
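
Once a cleaned-up label image is available, grain size and shape can be summarized with standard scikit-image region properties. The snippet below is a generic sketch rather than the notebook's code; the pixel-to-millimetre scale and the chosen properties are placeholders.

```python
import pandas as pd
from skimage import measure

def grain_size_table(labels, mm_per_pixel=1.0):
    """Summarize grain size and shape from a labeled grain image (sketch)."""
    props = measure.regionprops_table(
        labels,
        properties=('label', 'area', 'major_axis_length',
                    'minor_axis_length', 'perimeter'),
    )
    df = pd.DataFrame(props)
    # Convert pixel measurements to physical units (placeholder scale).
    df['major_axis_mm'] = df['major_axis_length'] * mm_per_pixel
    df['minor_axis_mm'] = df['minor_axis_length'] * mm_per_pixel
    df['area_mm2'] = df['area'] * mm_per_pixel ** 2
    return df
```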

The Train_seg_unet_model.ipynb notebook goes through the steps needed to create, train, and test the Unet model.
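
For orientation, a heavily simplified patch-based U-Net in Keras looks like the sketch below. The model in the notebook differs in depth, filter counts, loss, and data augmentation, so treat this purely as an illustration of the architecture family.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_unet(patch_size=256, n_classes=3):
    """Minimal U-Net sketch for patch-based grain segmentation (illustrative)."""
    inputs = layers.Input((patch_size, patch_size, 3))

    # Encoder
    c1 = layers.Conv2D(16, 3, activation='relu', padding='same')(inputs)
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(32, 3, activation='relu', padding='same')(p1)
    p2 = layers.MaxPooling2D()(c2)

    # Bottleneck
    b = layers.Conv2D(64, 3, activation='relu', padding='same')(p2)

    # Decoder with skip connections
    u2 = layers.concatenate([layers.UpSampling2D()(b), c2])
    c3 = layers.Conv2D(32, 3, activation='relu', padding='same')(u2)
    u1 = layers.concatenate([layers.UpSampling2D()(c3), c1])
    c4 = layers.Conv2D(16, 3, activation='relu', padding='same')(u1)

    outputs = layers.Conv2D(n_classes, 1, activation='softmax')(c4)
    model = Model(inputs, outputs)
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model

# Usage (with image patches and integer label patches):
# model = build_unet()
# model.fit(train_patches, train_labels,
#           validation_data=(val_patches, val_labels), epochs=10)
```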

The Segment_every_grain_colab.ipynb notebook has been adjusted so that the segmentation can be tested in Google Colab. That said, interactivity in Colab is not as smooth as in a local notebook.

Acknowledgements

Thanks to Danny Stockli, Nick Howes, Kalinda Roberts, Jake Covault, Matt Malkowski, Raymond Luong, and Sergey Fomel for discussions and/or helping with generating training data. Funding for this work came from the Quantitative Clastics Laboratory industrial consortium at the Bureau of Economic Geology, The University of Texas at Austin.

License

segmenteverygrain is licensed under the Apache License 2.0.

