🧶 plaknit
Processing Large-Scale PlanetScope Data
Note: plaknit is in active early-stage development. Expect frequent updates, and please share feedback or ideas through the GitHub Issues tab.
Planet data is phenomenal for tracking change, but the current acquisition strategy sprays dozens of narrow strips across a scene. Without careful masking and mosaicking, even "cloud free" searches still include haze, seams, and nodata gaps.
PlanetScope scenes are also huge. Building clean, analysis-ready products requires an automated workflow that can run on laptops or HPC clusters where GDAL, rasterio, and Orfeo Toolbox are already available.
plaknit packages the masking + mosaicking flow I rely on for regional mapping so the Planet community can stitch together reliable time series without copying shell scripts from old notebooks.
- Free software: MIT License
- Documentation: https://dzfinch.github.io/plaknit
Features
- GDAL-powered parallel masking of Planet strips with their UDM rasters.
- Tuned Orfeo Toolbox mosaicking pipeline with RAM hints for large jobs.
- CLI + Python API that scale from local experimentation to HPC batch runs.
- Raster analysis helpers (e.g., normalized difference indices) built on rasterio.
- Random Forest training + inference utilities for classifying Planet stacks.
- Planning workflow that searches Planet's STAC/Data API, scores scenes, and (optionally) submits Orders API requests for clipped SR bundles.
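The normalized-difference helpers can be approximated with plain NumPy. This is a hedged sketch, not plaknit's actual API (the function name `normalized_difference` is mine); it masks pixels where the band sum is zero so the division never blows up on nodata:

```python
import numpy as np

def normalized_difference(nir, red):
    """(nir - red) / (nir + red), with NaN where the denominator is zero."""
    nir = nir.astype("float64")
    red = red.astype("float64")
    denom = nir + red
    out = np.full(nir.shape, np.nan)
    # Only divide where the denominator is nonzero; other pixels stay NaN.
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Toy 2x2 reflectance arrays; real use would read bands with rasterio.
nir = np.array([[0.5, 0.4], [0.0, 0.6]])
red = np.array([[0.1, 0.2], [0.0, 0.2]])
ndvi = normalized_difference(nir, red)
```

In a rasterio workflow the same function would be applied to band arrays read from the mosaic.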
Masking & Mosaicking CLI (stitch)
When the SR scenes land, run the bundled stitch driver (no extra scripting required). Point it at the clipped strips, their UDMs, and the desired output path; the command handles GDAL masking + Orfeo Toolbox mosaicking with parallel workers, RAM hints, and concise progress bars (Mask tiles → Binary mask → Mosaic):
```shell
plaknit stitch \
  --inputs /data/planet/strips/*.tif \
  --udms /data/planet/strips/*.udm2.tif \
  --output /data/mosaics/planet_mosaic_2024.tif \
  --sr-bands 8 \
  --ndvi \
  --jobs 8 \
  --ram 196608
```
Customize --jobs, --ram, or --workdir/--tmpdir as needed for your local or
HPC environment. You can also invoke it as plaknit mosaic for backward compatibility.
Pass --ndvi to append NDVI (bands 4/3 for 4-band SR, 8/6 for 8-band SR) to the
output mosaic.
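The --ram value in the example above (196608) looks like a megabyte figure: Orfeo Toolbox applications take their RAM hint in MB, and 192 GB × 1024 = 196608. Assuming plaknit passes the flag through to OTB unchanged, you can derive the value for your node size like this:

```shell
# Convert a node's RAM budget in GB to the MB figure --ram appears to expect.
RAM_GB=192
RAM_MB=$((RAM_GB * 1024))
echo "$RAM_MB"
```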
Planning & Ordering Monthly Planet Composites (Beta)
plaknit plan runs on your laptop or login node to query Planet's STAC/Data
API, apply environmental filters (clouds, sun elevation), tile the AOI, and
select a minimal set of scenes per month that hit both coverage and clear
observation depth targets. The same command can immediately turn those plans
into Planet orders that deliver clipped surface reflectance scenes (4- or 8-band,
optionally harmonized to Sentinel-2) as single-archive ZIPs chunked into orders
of up to 100 scenes.
```shell
plaknit plan \
  --aoi aoi.gpkg \
  --start 2024-01-01 \
  --end 2024-12-31 \
  --cloud-max 0.1 \
  --sun-elev-min 35 \
  --coverage-target 0.98 \
  --min-clear-fraction 0.8 \
  --min-clear-obs 3 \
  --tile-size-m 1000 \
  --sr-bands 8 \
  --harmonize-to sentinel2 \
  --out monthly_plan.json \
  --order \
  --order-prefix plk_region01
```
Planning + ordering stay on the non-HPC side; once scenes arrive (clipped to
the AOI and optionally harmonized), push them through plaknit stitch (alias
plaknit mosaic) or future compositing tools on HPC to build median reflectance
mosaics.
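plaknit's own selection logic isn't shown here, but "select a minimal set of scenes per month that hit clear observation depth targets" suggests a greedy set-cover pass over tiles. As a rough illustration only (all names are hypothetical, not plaknit's API):

```python
def select_scenes(scene_tiles, all_tiles, min_clear_obs=3):
    """Greedy cover: scene_tiles maps scene id -> set of tile ids
    the scene clearly observes; pick scenes until every tile has
    at least min_clear_obs clear observations or no scene helps."""
    counts = {t: 0 for t in all_tiles}
    chosen = []
    remaining = dict(scene_tiles)
    while any(c < min_clear_obs for c in counts.values()) and remaining:
        # Take the scene that adds the most still-needed observations.
        best = max(remaining, key=lambda s: sum(
            1 for t in remaining[s] if counts[t] < min_clear_obs))
        gain = sum(1 for t in remaining[best] if counts[t] < min_clear_obs)
        if gain == 0:
            break  # no remaining scene improves coverage
        for t in remaining.pop(best):
            counts[t] += 1
        chosen.append(best)
    return chosen

# Scene "c" alone covers every tile once, so it is picked first.
scenes = {"a": {1, 2}, "b": {2, 3}, "c": {1, 2, 3}}
picked = select_scenes(scenes, all_tiles=[1, 2, 3], min_clear_obs=1)
```

The real planner additionally weighs cloud cover, sun elevation, and per-tile clear fractions when scoring candidates.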
Already have a stored plan JSON/GeoJSON? Submit the corresponding orders later without replanning via:
```shell
plaknit order \
  --plan monthly_plan.json \
  --aoi aoi.gpkg \
  --sr-bands 4 \
  --harmonize-to none \
  --order-prefix plk_region01 \
  --archive-type zip
```
plaknit order reuses the original AOI for clip/harmonization settings,
applies optional harmonization, and prints a summary of each submitted order ID
(orders split into batches of ≤100 scenes with order/ZIP names suffixed _1,
_2, ... when needed).
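The ≤100-scene batching with _1, _2 suffixes can be sketched as follows (a hypothetical helper, not plaknit's API; the suffix is only applied when more than one batch is needed, matching the behavior described above):

```python
def batch_order_names(scene_ids, prefix, batch_size=100):
    """Split scene ids into order batches of at most batch_size,
    suffixing order names _1, _2, ... when multiple batches exist."""
    batches = [scene_ids[i:i + batch_size]
               for i in range(0, len(scene_ids), batch_size)]
    if len(batches) == 1:
        return [(prefix, batches[0])]
    return [(f"{prefix}_{i + 1}", b) for i, b in enumerate(batches)]

# 250 scenes -> three orders: two full batches of 100 and one of 50.
orders = batch_order_names([f"scene{i}" for i in range(250)], "plk_region01")
```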
File details
Details for the file plaknit-0.1.0.tar.gz.
File metadata
- Download URL: plaknit-0.1.0.tar.gz
- Upload date:
- Size: 36.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 675202260b8604bcfa419c1282580409f0d5f0208b12ba40afca5477f31edd67 |
| MD5 | 9e17b3dc64cce5ad3df768247ca2fc12 |
| BLAKE2b-256 | a1f5f808ed57b820acc74246cdc7d29225ca75ad102093fcb7fec597dcc19b63 |
Provenance
The following attestation bundles were made for plaknit-0.1.0.tar.gz:
Publisher: release.yml on dzfinch/plaknit
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: plaknit-0.1.0.tar.gz
- Subject digest: 675202260b8604bcfa419c1282580409f0d5f0208b12ba40afca5477f31edd67
- Sigstore transparency entry: 802518996
- Sigstore integration time:
- Permalink: dzfinch/plaknit@d3f56b9b114f35989224d50c104525502f9fb452
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/dzfinch
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@d3f56b9b114f35989224d50c104525502f9fb452
- Trigger Event: push
File details
Details for the file plaknit-0.1.0-py3-none-any.whl.
File metadata
- Download URL: plaknit-0.1.0-py3-none-any.whl
- Upload date:
- Size: 35.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c783b43d3c71f0765460136320da6d57a12acc55fefde14c029c74916576ab62 |
| MD5 | 98118e12d01a0ae698d0e9199082a255 |
| BLAKE2b-256 | fbd909b3efc4299f8463815b8d08113a07face91f9f593c555856e0b02abc8e1 |
Provenance
The following attestation bundles were made for plaknit-0.1.0-py3-none-any.whl:
Publisher: release.yml on dzfinch/plaknit
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: plaknit-0.1.0-py3-none-any.whl
- Subject digest: c783b43d3c71f0765460136320da6d57a12acc55fefde14c029c74916576ab62
- Sigstore transparency entry: 802519066
- Sigstore integration time:
- Permalink: dzfinch/plaknit@d3f56b9b114f35989224d50c104525502f9fb452
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/dzfinch
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@d3f56b9b114f35989224d50c104525502f9fb452
- Trigger Event: push