
Processing Large-Scale PlanetScope Data

Project description

plaknit



  • Planet data is phenomenal for tracking change, but the current acquisition strategy sprays dozens of narrow strips across a scene. Without careful masking and mosaicking, even "cloud free" searches still include haze, seams, and nodata gaps.

  • PlanetScope scenes are also huge. Building clean, analysis-ready products requires an automated workflow that can run on laptops or HPC clusters where GDAL, rasterio, and Orfeo Toolbox are already available.

  • plaknit packages the masking + mosaicking flow I rely on for regional mapping so the Planet community can stitch together reliable time series without copying shell scripts from old notebooks.

  • Free software: MIT License

  • Documentation: https://dzfinch.github.io/plaknit

Features

  • GDAL-powered parallel masking of Planet strips with their UDM rasters.
  • Tuned Orfeo Toolbox mosaicking pipeline with RAM hints for large jobs.
  • CLI + Python API that scale from local experimentation to HPC batch runs.
  • Raster analysis helpers (e.g., normalized difference indices) built on rasterio.
  • Random Forest training + inference utilities for classifying Planet stacks.
  • Planning workflow that searches Planet's STAC/Data API, scores scenes, and (optionally) submits Orders API requests for clipped SR bundles.
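Once rasterio has read the bands into NumPy arrays, a normalized difference index is plain band arithmetic. A minimal sketch of the idea; `normalized_difference` is an illustrative name, not plaknit's actual API:

```python
import numpy as np

def normalized_difference(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Compute (a - b) / (a + b), returning NaN where the sum is zero."""
    a = band_a.astype("float64")
    b = band_b.astype("float64")
    denom = a + b
    out = np.full(a.shape, np.nan)
    valid = denom != 0
    out[valid] = (a[valid] - b[valid]) / denom[valid]
    return out

# NDVI from NIR and red reflectance (nodata pixels stay NaN)
nir = np.array([[0.5, 0.4], [0.0, 0.6]])
red = np.array([[0.1, 0.2], [0.0, 0.2]])
ndvi = normalized_difference(nir, red)
```

Masked or nodata pixels fall out naturally: a zero denominator maps to NaN rather than raising a divide warning.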

Masking & Mosaicking CLI

When the surface reflectance (SR) scenes land, run the bundled mosaic driver (no extra scripting required). Point it at the clipped strips, their UDMs, and the desired output path; the command handles GDAL masking and Orfeo Toolbox mosaicking with parallel workers and RAM hints:

plaknit \
  --inputs /data/planet/strips/*.tif \
  --udms /data/planet/strips/*.udm2.tif \
  --output /data/mosaics/planet_mosaic_2024.tif \
  --jobs 8 \
  --ram 196608

Customize --jobs, --ram, or --workdir/--tmpdir as needed for your local or HPC environment. The CLI mirrors the legacy mosaic_planet.py workflow so you can keep using existing recipes with minimal tweaks.
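For HPC batch runs, the same invocation drops straight into a scheduler script. A sketch for SLURM, where the module names, resource requests, and paths are placeholders to adapt to your site:

```shell
#!/bin/bash
#SBATCH --job-name=plaknit-mosaic
#SBATCH --cpus-per-task=8
#SBATCH --mem=192G
#SBATCH --time=12:00:00

# Site-specific: make GDAL and Orfeo Toolbox available on the compute node
module load gdal orfeo-toolbox

plaknit \
  --inputs /data/planet/strips/*.tif \
  --udms /data/planet/strips/*.udm2.tif \
  --output /data/mosaics/planet_mosaic_2024.tif \
  --jobs "$SLURM_CPUS_PER_TASK" \
  --ram 196608
```

Keeping `--jobs` tied to `$SLURM_CPUS_PER_TASK` and `--ram` (here 196608 MB, i.e. 192 GiB) within the `--mem` request avoids oversubscribing the allocation.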

Planning & Ordering Monthly Planet Composites (Beta)

plaknit plan runs on your laptop or login node to query Planet's STAC/Data API, apply environmental filters (clouds, sun elevation), tile the AOI, and select a minimal set of scenes per month that hit both coverage and clear observation depth targets. The same command can immediately turn those plans into Planet orders that deliver clipped surface reflectance scenes (4- or 8-band, optionally harmonized to Sentinel-2) as one ZIP per scene/bundle.

plaknit plan \
  --aoi aoi.gpkg \
  --start 2024-01-01 \
  --end 2024-12-31 \
  --cloud-max 0.1 \
  --sun-elev-min 35 \
  --coverage-target 0.98 \
  --min-clear-fraction 0.8 \
  --min-clear-obs 3 \
  --tile-size-m 1000 \
  --sr-bands 8 \
  --harmonize-to sentinel2 \
  --out monthly_plan.json \
  --order \
  --order-prefix plk_region01

Planning + ordering stay on the non-HPC side; once scenes arrive (clipped to the AOI and optionally harmonized), push them through plaknit mosaic or future compositing tools on HPC to build median reflectance mosaics.
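Conceptually, the per-month selection behaves like a greedy set cover over the AOI tiles: keep picking the scene that clears the most still-uncovered tiles until the coverage target is met. A toy sketch of that idea, not plaknit's actual implementation, and ignoring the clear-observation-depth constraint:

```python
def plan_month(scenes, n_tiles, coverage_target=0.98):
    """Greedy pick. `scenes` maps scene_id -> set of covered tile indices.
    Returns (selected scene ids, achieved coverage fraction)."""
    covered, selected = set(), []
    remaining = dict(scenes)
    while len(covered) / n_tiles < coverage_target and remaining:
        # choose the scene that adds the most not-yet-covered tiles
        best = max(remaining, key=lambda s: len(remaining[s] - covered))
        if not remaining[best] - covered:
            break  # no remaining scene adds coverage
        covered |= remaining.pop(best)
        selected.append(best)
    return selected, len(covered) / n_tiles

# 10 tiles; scene C covers the most, then A, then B fills the gap
scenes = {
    "A": {0, 1, 2, 3},
    "B": {3, 4, 5},
    "C": {5, 6, 7, 8, 9},
}
picked, coverage = plan_month(scenes, n_tiles=10)
# picked == ["C", "A", "B"], coverage == 1.0
```

The real planner also has to weigh cloud scores and per-tile clear-observation counts, but the greedy coverage loop is the intuition behind "a minimal set of scenes per month".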

Already have a stored plan JSON/GeoJSON? Submit the corresponding orders later without replanning via:

plaknit order \
  --plan monthly_plan.json \
  --aoi aoi.gpkg \
  --sr-bands 4 \
  --harmonize-to none \
  --order-prefix plk_region01 \
  --archive-type zip

plaknit order reuses the original AOI for clip/harmonization settings, applies optional harmonization, and prints a summary of each submitted order ID.

Download files

Download the file for your platform.

Source Distribution

plaknit-0.0.5.tar.gz (27.2 kB)

Uploaded Source

Built Distribution


plaknit-0.0.5-py3-none-any.whl (26.0 kB)

Uploaded Python 3

File details

Details for the file plaknit-0.0.5.tar.gz.

File metadata

  • Download URL: plaknit-0.0.5.tar.gz
  • Upload date:
  • Size: 27.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for plaknit-0.0.5.tar.gz
Algorithm Hash digest
SHA256 521d3a19460b5d99616358daff24032f149c408aa0a5953ab6463e9766956c44
MD5 ef85ab1849833663d7a4b7e6e864f9b4
BLAKE2b-256 7028699a709a7fac9f46652cf6478cd53c47f50dc39818962cce956ffacc4335

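To check a downloaded archive against the digests above, the standard library's hashlib is enough. A short sketch that streams the file in chunks so large archives never load fully into memory:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# compare sha256_of("plaknit-0.0.5.tar.gz") against the published SHA256
```

Installers like pip can enforce this automatically via `--require-hashes`, but a manual spot-check is this simple.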

Provenance

The following attestation bundles were made for plaknit-0.0.5.tar.gz:

Publisher: release.yml on dzfinch/plaknit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file plaknit-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: plaknit-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 26.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for plaknit-0.0.5-py3-none-any.whl
Algorithm Hash digest
SHA256 19a7c1d2d4589b773760551a468f8e4de63cdfd7a6ef7d8f7885b396fbcce7db
MD5 c4d3b7959a5a923d752d7172e81c72a1
BLAKE2b-256 21f7fe64d94bccaeeb24eea29b043a247db53039585f406d51b60a99bbc134eb


Provenance

The following attestation bundles were made for plaknit-0.0.5-py3-none-any.whl:

Publisher: release.yml on dzfinch/plaknit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
