
Composable image chunk operators for creating pipelines for distributed computation.

Project description

chunkflow


Perform convolutional net inference to segment a 3D image volume with a single command!

chunkflow read-tif --file-name path/of/image.tif -o image \
    inference --convnet-model path/of/model.py --convnet-weight-path path/of/weight.pt \
        --input-patch-size 20 256 256 --output-patch-overlap 4 64 64 --num-output-channels 3 \
        -f pytorch --batch-size 12 --mask-output-chunk -i image -o affs \
    write-h5 -i affs --file-name affs.h5 \
    agglomerate --threshold 0.7 --aff-threshold-low 0.001 --aff-threshold-high 0.9999 -i affs -o seg \
    write-tif -i seg -f seg.tif \
    neuroglancer -c image,affs,seg -p 33333 -v 30 6 6

You can see your 3D image and segmentation directly in Neuroglancer!

Figure: 3D image and segmentation visualized in Neuroglancer.
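To view a volume without rerunning inference, the same operators can be recomposed. A minimal sketch that reuses only flags shown in the command above (the file path is a placeholder):

chunkflow read-tif --file-name path/of/image.tif -o image neuroglancer -c image -p 33333 -v 30 6 6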

Features

  • Composable operators. Chunk operators can be freely composed on the command line for flexible usage (see the minimal sketch after this list).
  • Hybrid cloud. Distributed computation across both local and cloud computers. The task-scheduling frontend and the computationally heavy backend are decoupled using AWS Simple Queue Service (SQS); the backend can be any computer with an internet connection and Amazon Web Services (AWS) credentials.
  • All operations support 3D image volumes.
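As a small illustration of composability, the sketch below chains two of the operators from the inference command into a TIFF-to-HDF5 conversion pipeline; it uses only flags shown in that command, and the paths are placeholders:

chunkflow read-tif --file-name path/of/image.tif -o image write-h5 -i image --file-name image.h5

Each operator reads the named chunk produced by an earlier one (-o names an output chunk, -i selects an input chunk), so stages can be added, removed, or reordered without touching any code.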

Operators

After installation, you can simply type chunkflow and it will list all the operators with their help messages. The available operators are listed below; we keep adding new operators and will keep this list up to date. For detailed usage, please check out our Documentation.
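For example (the bare chunkflow invocation is described above; the per-operator --help form is an assumption based on common command-line conventions and is not shown on this page):

chunkflow
chunkflow inference --help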

Operator Name Function
agglomerate Watershed and agglomeration to segment affinity map
cloud-watch Real-time speedometer in AWS CloudWatch
connected-components Threshold the boundary map to get a segmentation
copy-var Copy a variable to a new name
create-chunk Create a fake chunk for easy test
crop-margin Crop the margin of a chunk
custom-operator Import local code as a customized operator
cutout Cutout chunk from a local/cloud storage volume
delete-task-in-queue Delete the task in AWS SQS queue
downsample-upload Downsample the chunk hierarchically and upload to volume
evaluate-segmentation Compare segmentation chunks
fetch-task Fetch task from AWS SQS queue one by one
generate-tasks Generate tasks one by one
inference Convolutional net inference
log-summary Summary of logs
mask Black out the chunk based on another mask chunk
mesh Build 3D meshes from segmentation chunk
mesh-manifest Collect mesh fragments for objects
neuroglancer Visualize chunks using Neuroglancer
normalize-section-contrast Normalize image contrast
normalize-section-shang Normalization algorithm created by Shang
quantize Quantize the affinity map
read-h5 Read HDF5 files
read-tif Read TIFF files
save Save chunk to local/cloud storage volume
save-pngs Save chunk as a series of PNG files
setup-env Prepare storage info files and produce tasks
view Another chunk viewer in browser using CloudVolume
write-h5 Write chunk as HDF5 file
write-tif Write chunk as TIFF file
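The create-chunk operator above makes it easy to try a pipeline without any real data. A minimal sketch, assuming create-chunk accepts the same -o flag for naming its output chunk as the operators in the inference example (that flag is not documented on this page):

chunkflow create-chunk -o image neuroglancer -c image -p 33333 -v 30 6 6

This launches the Neuroglancer viewer on a fake chunk, which is a quick way to check that an installation works before pointing the pipeline at large volumes.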

Reference

We have a paper about this repository:

@article{wu2019chunkflow,
  title={Chunkflow: Distributed Hybrid Cloud Processing of Large 3D Images by Convolutional Nets},
  author={Wu, Jingpeng and Silversmith, William M and Seung, H Sebastian},
  journal={arXiv preprint arXiv:1904.10489},
  year={2019}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

chunkflow-0.5.8.tar.gz (63.1 kB)

Uploaded Source

File details

Details for the file chunkflow-0.5.8.tar.gz.

File metadata

  • Download URL: chunkflow-0.5.8.tar.gz
  • Upload date:
  • Size: 63.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/42.0.1.post20191125 requests-toolbelt/0.9.1 tqdm/4.39.0 CPython/3.7.5

File hashes

Hashes for chunkflow-0.5.8.tar.gz
Algorithm Hash digest
SHA256 fe22056673981b05fe01f0be110055dfba278b30268e9b6bbbfd996e1612a0da
MD5 07cd22a03a4613d60cbda61af66ef42d
BLAKE2b-256 fc5cc87f98713b2ff40c5aad0bc99e3767b5e44bd8263c6d9cb617305e9f0c1b

See more details on using hashes here.
