odc.apps.dc_tools

Command-line utilities for working with a Datacube index

Installation

pip install odc-apps-dc-tools

Usage

dc-sync-products

The dc-sync-products tool keeps a Datacube instance's list of products up to date with a CSV of product names and definitions.

Basic usage is:

dc-sync-products <path-to-csv> --update-if-exists

The --update-if-exists flag is optional; if a product already exists, it will be updated, including unsafe changes. The CSV format is as follows (note that one file can define multiple products, and several product names separated by semicolons can share one definition):

product,definition
dem_srtm,https://raw.githubusercontent.com/digitalearthafrica/config/master/products/dem_srtm.odc-product.yaml
ls5_c2l2_sr;ls7_c2l2_sr;ls8_c2l2_sr;ls9_c2l2_sr,https://raw.githubusercontent.com/opendatacube/datacube-dataset-config/main/products/lsX_c2l2_sr.odc-product.yaml
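
For example, if the CSV above is saved as products.csv (the filename is illustrative), the products can be created and updated in a single call:

dc-sync-products products.csv --update-if-exists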

dc-index-export-md

Metadata transformer

Simple usage:

TODO:

Extended usage:

TODO:

dc-index-from-tar

Index ODC metadata that is contained in a .tar file

Simple usage:

dc-index-from-tar 'path/to/file.tar'
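
A tar archive of ODC metadata documents can be built with standard tools and then indexed; the metadata file names below are illustrative:

  # bundle EO3 metadata documents into an archive
  tar -cf metadata.tar *.odc-metadata.yaml

  # index every document contained in the archive
  dc-index-from-tar metadata.tar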

Extended usage:

TODO:

sqs-to-dc

A tool to index from an SQS queue

Simple usage:

sqs-to-dc example-queue-name 'product-name-a product-name-b'

Extended usage:

Usage: sqs-to-dc [OPTIONS] QUEUE_NAME PRODUCT

  Iterate through messages on an SQS queue and add them to datacube

Options:
  --skip-lineage                  Default is not to skip lineage. Set to skip
                                  lineage altogether.

  --fail-on-missing-lineage / --auto-add-lineage
                                  Default is to fail if lineage documents not
                                  present in the database. Set auto add to try
                                  to index lineage documents.

  --verify-lineage                Default is no verification. Set to verify
                                  parent dataset definitions.

  --stac                          Expect STAC 1.0 metadata and attempt to
                                  transform to ODC EO3 metadata

  --odc-metadata-link TEXT        Expect metadata doc with ODC EO3 metadata
                                  link. Either provide '/' separated path to
                                  find metadata link in a provided metadata
                                  doc e.g. 'foo/bar/link', or if metadata doc
                                  is STAC, provide 'rel' value of the 'links'
                                  object having metadata link. e.g. 'STAC-
                                  LINKS-REL:odc_yaml'

  --limit INTEGER                 Stop indexing after n datasets have been
                                  indexed.

  --update                        If set, update instead of add datasets
  --update-if-exists              If the dataset already exists, update it
                                  instead of skipping it.

  --archive                       If set, archive datasets
  --allow-unsafe                  Allow unsafe changes to a dataset. Take
                                  care!

  --record-path TEXT              Filtering option for s3 path, i.e.
                                  'L2/sentinel-2-nrt/S2MSIARD/*/*/ARD-
                                  METADATA.yaml'

  --region-code-list-uri TEXT     A path to a list (one item per line, in txt
                                  or gzip format) of valid region_codes to
                                  include

  --absolute                      Use absolute paths when converting from stac

  --archive-less-mature           Find less mature versions of the dataset and
                                  archive them
                                  
  --publish-action SNS ARN        Publish indexing action to SNS topic

  --help                          Show this message and exit.
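
As an illustrative combination of the options above (the queue and product names are placeholders), the following indexes STAC documents from a queue, updates datasets that already exist, and stops after 1000 datasets:

  sqs-to-dc \
    --stac \
    --update-if-exists \
    --limit 1000 \
    example-queue-name 'product-name-a product-name-b'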

s3-to-dc

A tool for indexing from S3.

Simple usage:

s3-to-dc 's3://bucket/path/**/*.yaml' 'product-name-a product-name-b'

Extended usage:

The following command updates datasets instead of adding them and allows unsafe changes; the S3 path and product names below are placeholders. Be careful!
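
s3-to-dc --update --allow-unsafe 's3://bucket/path/**/*.yaml' 'product-name-a product-name-b'

The full set of options is: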

Usage: s3-to-dc [OPTIONS] URI PRODUCT

  Iterate through files in an S3 bucket and add them to datacube

Options:
  --skip-lineage                  Default is not to skip lineage. Set to skip
                                  lineage altogether.

  --fail-on-missing-lineage / --auto-add-lineage
                                  Default is to fail if lineage documents not
                                  present in the database. Set auto add to try
                                  to index lineage documents.

  --verify-lineage                Default is no verification. Set to verify
                                  parent dataset definitions.

  --stac                          Expect STAC 1.0 metadata and attempt to
                                  transform to ODC EO3 metadata

  --update                        If set, update instead of add datasets
  --update-if-exists              If the dataset already exists, update it
                                  instead of skipping it.

  --allow-unsafe                  Allow unsafe changes to a dataset. Take
                                  care!

  --skip-check                    Assume file exists when listing exact file
                                  rather than wildcard.

  --no-sign-request               Do not sign AWS S3 requests
  --request-payer                 Needed when accessing requester pays public
                                  buckets

  --archive-less-mature           Find less mature versions of the dataset and
                                  archive them

  --publish-action SNS ARN        Publish indexing action to SNS topic

  --help                          Show this message and exit.

thredds-to-dc

Index from a THREDDS server

Simple usage:

TODO:

Extended usage:

TODO:

esri-lc-to-dc

This tool has been removed; use the stac-to-dc tool instead:

  stac-to-dc \
    --catalog-href=https://planetarycomputer.microsoft.com/api/stac/v1/ \
    --collections='io-lulc'
