
🧶 plaknit


Processing Large-Scale PlanetScope Data

Note: plaknit is in active early-stage development. Expect frequent updates, and please share feedback or ideas through the GitHub Issues tab.

  • Planet data is phenomenal for tracking change, but the current acquisition strategy sprays dozens of narrow strips across a scene. Without careful masking and mosaicking, even "cloud free" searches still include haze, seams, and nodata gaps.

  • PlanetScope scenes are also huge. Building clean, analysis-ready products requires an automated workflow that can run on laptops or HPC clusters where GDAL, rasterio, and Orfeo Toolbox are already available.

  • plaknit packages the masking + mosaicking flow I rely on for regional mapping so the Planet community can stitch together reliable time series without copying shell scripts from old notebooks.

  • Free software: MIT License

  • Documentation: https://dzfinch.github.io/plaknit

Features

  • GDAL-powered parallel masking of Planet strips with their UDM rasters.
  • Tuned Orfeo Toolbox mosaicking pipeline with RAM hints for large jobs.
  • CLI + Python API that scale from local experimentation to HPC batch runs.
  • Raster analysis helpers (e.g., normalized difference indices) built on rasterio.
  • Random Forest training + inference utilities for classifying Planet stacks.
  • Planning workflow that searches Planet's STAC/Data API, scores scenes, and (optionally) submits Orders API requests for clipped SR bundles.
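
The UDM-driven masking in the first bullet reduces to a simple array operation: band 1 of Planet's UDM2 raster marks clear pixels (value 1), and every spectral band is set to nodata wherever that flag is absent. The sketch below is illustrative only; plaknit's actual implementation runs through GDAL with parallel workers.

```python
import numpy as np

def apply_udm_mask(scene, udm_clear, nodata=0):
    """Set every band to `nodata` wherever the UDM 'clear' band is not 1.

    scene:     (bands, rows, cols) surface reflectance array
    udm_clear: (rows, cols) band 1 of Planet's UDM2, where 1 means clear
    """
    masked = scene.copy()
    masked[:, udm_clear != 1] = nodata  # broadcast the 2-D mask over all bands
    return masked
```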

Masking & Mosaicking CLI (stitch)

When the SR scenes land, run the bundled stitch driver (no extra scripting required). Point it at the clipped strips, their UDMs, and the desired output path; the command handles GDAL masking + Orfeo Toolbox mosaicking with parallel workers, RAM hints, and concise progress bars (Mask tiles → Binary mask → Mosaic):

plaknit stitch \
  --inputs /data/planet/strips/*.tif \
  --udms /data/planet/strips/*.udm2.tif \
  --output /data/mosaics/planet_mosaic_2024.tif \
  --sr-bands 8 \
  --ndvi \
  --jobs 8 \
  --ram 196608

Customize --jobs, --ram, or --workdir/--tmpdir as needed for your local or HPC environment. You can also invoke it as plaknit mosaic for backward compatibility. Pass --ndvi to append NDVI (bands 4/3 for 4-band SR, 8/6 for 8-band SR) to the output mosaic.
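
The NDVI band that --ndvi appends follows the standard normalized-difference formula. A minimal numpy sketch using the 1-based band positions quoted above (NIR/red = 4/3 for 4-band SR, 8/6 for 8-band SR); this is an illustration of the formula, not plaknit's code:

```python
import numpy as np

# 1-based band positions quoted above: 4-band SR has red in band 3 and
# NIR in band 4; 8-band SR has red in band 6 and NIR in band 8.
NDVI_BANDS = {4: (4, 3), 8: (8, 6)}  # sr_bands -> (nir, red)

def ndvi(stack, sr_bands):
    """Compute NDVI from a (bands, rows, cols) reflectance stack."""
    nir_idx, red_idx = NDVI_BANDS[sr_bands]
    nir = stack[nir_idx - 1].astype("float32")
    red = stack[red_idx - 1].astype("float32")
    denom = nir + red
    out = np.zeros_like(denom)
    # Guard against division by zero in nodata regions.
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out
```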

Planning & Ordering Monthly Planet Composites (Beta)

plaknit plan runs on your laptop or login node: it queries Planet's STAC/Data API, applies environmental filters (cloud cover, sun elevation), tiles the AOI, and selects a minimal set of scenes per month that meets both the coverage and clear-observation-depth targets. The same command can immediately turn those plans into Planet orders that deliver clipped surface reflectance scenes (4- or 8-band, optionally harmonized to Sentinel-2) as single-archive ZIPs, chunked into orders of up to 100 scenes.

plaknit plan \
  --aoi aoi.gpkg \
  --start 2024-01-01 \
  --end 2024-12-31 \
  --cloud-max 0.1 \
  --sun-elev-min 35 \
  --coverage-target 0.98 \
  --min-clear-fraction 0.8 \
  --min-clear-obs 3 \
  --tile-size-m 1000 \
  --sr-bands 8 \
  --harmonize-to sentinel2 \
  --out monthly_plan.json \
  --order \
  --order-prefix plk_region01

Planning + ordering stay on the non-HPC side; once scenes arrive (clipped to the AOI and optionally harmonized), push them through plaknit stitch (alias plaknit mosaic) or future compositing tools on HPC to build median reflectance mosaics.
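
The per-month scene selection described above behaves like a greedy set cover over AOI tiles. A minimal sketch of that idea, ignoring the clear-observation-depth and environmental scoring that plaknit's planner also weighs:

```python
def greedy_select(scene_tiles, all_tiles, coverage_target=0.98):
    """Greedy set cover: repeatedly pick the scene that covers the most
    still-uncovered AOI tiles until the coverage target is met.

    scene_tiles: dict mapping scene ID -> set of tile indices it covers
    all_tiles:   set of all tile indices in the AOI
    """
    selected, covered = [], set()
    target = coverage_target * len(all_tiles)
    while len(covered) < target:
        best = max(scene_tiles, key=lambda s: len(scene_tiles[s] - covered))
        gain = scene_tiles[best] - covered
        if not gain:  # no remaining scene adds coverage; stop short
            break
        selected.append(best)
        covered |= gain
    return selected
```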

Already have a stored plan JSON/GeoJSON? Submit the corresponding orders later without replanning via:

plaknit order \
  --plan monthly_plan.json \
  --aoi aoi.gpkg \
  --sr-bands 4 \
  --harmonize-to none \
  --order-prefix plk_region01 \
  --archive-type zip

plaknit order reuses the original AOI for clipping, applies the requested harmonization, and prints a summary of each submitted order ID (orders split into batches of ≤100 scenes, with order/ZIP names suffixed _1, _2, ... when needed).
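
The batching behavior is easy to picture: the plan's scene IDs are chunked into groups of at most 100 (Planet's Orders API cap) and each batch gets a suffixed name. A hypothetical sketch; the exact naming convention is assumed from the summary above:

```python
def batch_scene_ids(scene_ids, batch_size=100):
    """Split a plan's scene IDs into order-sized batches of <= batch_size."""
    return [scene_ids[i:i + batch_size]
            for i in range(0, len(scene_ids), batch_size)]

def order_names(prefix, n_batches):
    """Name each batch: a lone batch keeps the bare prefix, multiple
    batches get numeric suffixes (assumed convention)."""
    if n_batches == 1:
        return [prefix]
    return [f"{prefix}_{i}" for i in range(1, n_batches + 1)]
```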

plaknit order arguments:

  • --plan: Plan JSON/GeoJSON that defines which scene IDs (and months) are ordered.
  • --aoi: Geometry used to clip the delivered scenes.
  • --sr-bands: Chooses 4- or 8-band SR bundle; changes the bands in each scene.
  • --harmonize-to: sentinel2 harmonizes to Sentinel-2; none keeps native SR.
  • --order-prefix: Prefix for order name and archive filename; batches append _1, _2, etc., and ZIPs end with .zip.
  • --archive-type: Delivery archive format; Planet currently supports zip only.
  • --single-archive / --no-single-archive: One ZIP per order vs per-scene files.
  • -v / -vv: Verbose logging for submissions and retries; no change to output.

Download files

Download the file for your platform.

Source Distribution

plaknit-0.1.1.tar.gz (36.9 kB)


Built Distribution


plaknit-0.1.1-py3-none-any.whl (35.9 kB)


File details

Details for the file plaknit-0.1.1.tar.gz.

File metadata

  • Download URL: plaknit-0.1.1.tar.gz
  • Size: 36.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for plaknit-0.1.1.tar.gz:

  • SHA256: 8faba369cfa910366f43a4325b4020b95c42dd7b19892d52314f0c0b51997c0b
  • MD5: e7511dccc4c9343a0e862bb5f7301baa
  • BLAKE2b-256: 3ca5cb8954da229bb0c9648cd5c42224a5f920fe009a4baae09ffbf28e38e20f


Provenance

The following attestation bundles were made for plaknit-0.1.1.tar.gz:

Publisher: release.yml on dzfinch/plaknit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file plaknit-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: plaknit-0.1.1-py3-none-any.whl
  • Size: 35.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for plaknit-0.1.1-py3-none-any.whl:

  • SHA256: d35a89ca44839d4d8357837c8b83b9e7b855659754e40bb62de2b84ec3c37b19
  • MD5: be2bc2e682a4a73d6f077a188e177c27
  • BLAKE2b-256: 6cd219c383fbc340083a0b09610332149a25a9d7ca5380a3facda7466e0f49f2


Provenance

The following attestation bundles were made for plaknit-0.1.1-py3-none-any.whl:

Publisher: release.yml on dzfinch/plaknit

