CLI utils for working with a datacube index
odc.apps.dc_tools
Command line utilities for working with a datacube index
Installation
pip install odc-apps-dc-tools
Usage
dc-index-export-md
Metadata transformer
Simple usage:
TODO:
Extended usage:
TODO:
dc-index-from-tar
Index ODC metadata that is contained in a .tar file
Simple usage:
dc-index-from-tar 'path/to/file.tar'
Extended usage:
TODO:
sqs-to-dc
A tool to index from an SQS queue
Simple usage:
sqs-to-dc example-queue-name 'product-name-a product-name-b'
Extended usage:
Usage: sqs-to-dc [OPTIONS] QUEUE_NAME PRODUCT
Iterate through messages on an SQS queue and add them to datacube
Options:
--skip-lineage Default is not to skip lineage. Set to skip
lineage altogether.
--fail-on-missing-lineage / --auto-add-lineage
Default is to fail if lineage documents not
present in the database. Set auto add to try
to index lineage documents.
--verify-lineage Default is no verification. Set to verify
parent dataset definitions.
--stac Expect STAC 1.0 metadata and attempt to
transform to ODC EO3 metadata
  --odc-metadata-link TEXT        Expect metadata doc with ODC EO3 metadata
                                  link. Either provide '/' separated path to
                                  find metadata link in a provided metadata
                                  doc, e.g. 'foo/bar/link', or if metadata doc
                                  is STAC, provide 'rel' value of the 'links'
                                  object having metadata link, e.g.
                                  'STAC-LINKS-REL:odc_yaml'
--limit INTEGER Stop indexing after n datasets have been
indexed.
--update If set, update instead of add datasets
--update-if-exists If the dataset already exists, update it
instead of skipping it.
--archive If set, archive datasets
--allow-unsafe Allow unsafe changes to a dataset. Take
care!
  --record-path TEXT              Filtering option for s3 path, i.e.
                                  'L2/sentinel-2-nrt/S2MSIARD/*/*/ARD-METADATA.yaml'
  --region-code-list-uri TEXT     A path to a list (one item per line, in txt
                                  or gzip format) of valid region_codes to
                                  include
--absolute Use absolute paths when converting from stac
--help Show this message and exit.
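The documented options can be combined. As an illustrative sketch (the queue and product names here are placeholders), the following would consume STAC 1.0 items from a queue, transform them to ODC EO3 metadata, and stop after 100 datasets:

```shell
# Placeholder queue and product names; --stac and --limit are documented above.
sqs-to-dc --stac --limit 100 example-queue-name 'product-name-a'
```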
s3-to-dc
A tool for indexing from S3.
Simple usage:
s3-to-dc 's3://bucket/path/**/*.yaml' 'product-name-a product-name-b'
Extended usage:
The following command updates the datasets instead of adding them and allows unsafe changes. Be careful!
s3-to-dc --update --allow-unsafe 's3://bucket/path/**/*.yaml' 'product-name-a product-name-b'
Usage: s3-to-dc [OPTIONS] URI PRODUCT
Iterate through files in an S3 bucket and add them to datacube
Options:
--skip-lineage Default is not to skip lineage. Set to skip
lineage altogether.
--fail-on-missing-lineage / --auto-add-lineage
Default is to fail if lineage documents not
present in the database. Set auto add to try
to index lineage documents.
--verify-lineage Default is no verification. Set to verify
parent dataset definitions.
--stac Expect STAC 1.0 metadata and attempt to
transform to ODC EO3 metadata
--update If set, update instead of add datasets
--update-if-exists If the dataset already exists, update it
instead of skipping it.
--allow-unsafe Allow unsafe changes to a dataset. Take
care!
--skip-check Assume file exists when listing exact file
rather than wildcard.
--no-sign-request Do not sign AWS S3 requests
--request-payer Needed when accessing requester pays public
buckets
--help Show this message and exit.
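As a further sketch (the bucket path and product name are placeholders), the options above can be combined to crawl a public requester-anonymous bucket without signing requests, skipping the existence check when an exact file path rather than a wildcard is given:

```shell
# Placeholder bucket path and product name; --no-sign-request and --skip-check
# are documented above.
s3-to-dc --no-sign-request --skip-check 's3://bucket/path/scene/metadata.yaml' 'product-name-a'
```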
thredds-to-dc
Index from a THREDDS server
Simple usage:
TODO:
Extended usage:
TODO:
esri-lc-to-dc
Index the global 10 m ESRI Land Cover data.
Simple usage:
Index all the data and add the product first.
esri-lc-to-dc --add-product
Extended usage:
Index all the data, add the product, set a limit, and update scenes that are already indexed.
esri-lc-to-dc --add-product --limit 1000 --update