usgsxplore

Python client for the USGS M2M API — search, download, and process Earth observation imagery from EarthExplorer.

Supports 100+ datasets (Landsat, Hexagon KH-9, declassified imagery, aerial photos, and more). Provides both a CLI and a Python API.

Inspired by landsatxplore, with broader dataset support and additional features.


Installation

pip install usgsxplore

# or with pipx (recommended for CLI use)
pipx install usgsxplore

Credentials

You need a USGS ERS account with M2M API access enabled.

  1. Register at ers.cr.usgs.gov/register
  2. Request M2M API access at ers.cr.usgs.gov/profile/access — specify the datasets you plan to use
  3. Use your username and M2M token (not your password)

Set credentials as environment variables to avoid typing them every time:

export USGS_USERNAME=<your_username>
export USGS_TOKEN=<your_token>
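The same fallback convention can be followed in your own scripts when credentials are not hard-coded. A minimal sketch of the lookup pattern (the helper name `get_credentials` is hypothetical; only the `USGS_USERNAME` / `USGS_TOKEN` variable names come from the export lines above):

```python
import os

def get_credentials(username=None, token=None):
    """Return (username, token), falling back to the USGS_* environment variables."""
    username = username or os.environ.get("USGS_USERNAME")
    token = token or os.environ.get("USGS_TOKEN")
    if not username or not token:
        raise RuntimeError("USGS credentials not found; set USGS_USERNAME and USGS_TOKEN")
    return username, token
```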

Quick start

# Search for Landsat scenes at a location between 2010 and 2020
usgsxplore search landsat_tm_c2_l1 --location 5.7074 45.1611 --interval-date 2010-01-01 2020-01-01

# Search for Hexagon KH-9 scenes and export to GeoPackage + HTML map
usgsxplore search declassii --filter "camera=H" --output results.gpkg --output map.html

# Download the first 10 results
usgsxplore search landsat_tm_c2_l1 --limit 10 --output results.txt
usgsxplore download results.txt

CLI Reference

usgsxplore [OPTIONS] COMMAND [ARGS]...

Commands:
  search           Search scenes in a dataset
  download         Download scenes from a text file of entity IDs
  download-browse  Download individual browse (preview) images from a vector file
  info             List available datasets and metadata filters

search

Search scenes in a dataset, with optional spatial, temporal, and metadata filters.

usgsxplore search [OPTIONS] DATASET
Options:
  -o  / --output         Output file (repeatable; format inferred from extension)
  -vf / --vector-file    Vector file for spatial filter (.gpkg, .shp, .geojson)
  -l  / --location       Point filter: longitude latitude
  -b  / --bbox           Bounding box: xmin ymin xmax ymax
  -c  / --clouds         Max cloud cover percentage (1–100)
  -i  / --interval-date  Date range: YYYY-MM-DD YYYY-MM-DD
  -f  / --filter         Metadata filter string (see Filter syntax)
  -m  / --limit          Max number of results (default: all)
  --pbar                 Show progress bar

Output formats:

  Extension                Content
  .txt                     Entity IDs, one per line (usable with download)
  .json                    Raw API response with full metadata
  .gpkg / .shp / .geojson  Vector file with scene footprints
  .html                    Interactive map for quick visualization

Multiple outputs can be specified simultaneously:

usgsxplore search declassii --filter "camera=H" --output scenes.gpkg --output map.html
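The extension-based format inference can be pictured as a simple dispatch table. This is an illustrative sketch only (the function and mapping names are hypothetical, not the package's internals), covering the extensions listed above:

```python
from pathlib import Path

# Hypothetical mapping from file extension to output writer category.
FORMATS = {
    ".txt": "entity_ids",
    ".json": "api_response",
    ".gpkg": "vector",
    ".shp": "vector",
    ".geojson": "vector",
    ".html": "map",
}

def infer_format(output_path):
    """Pick an output format from the file extension, as the CLI does."""
    ext = Path(output_path).suffix.lower()
    try:
        return FORMATS[ext]
    except KeyError:
        raise ValueError(f"Unsupported output extension: {ext!r}")
```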

download

Download scenes from a .txt file of entity IDs (produced by search).

usgsxplore download [OPTIONS] TEXTFILE
Options:
  -d / --dataset         Dataset name (auto-read from file header if present)
  -p / --product-number  Product index when multiple products are available
  -o / --output-dir      Output directory (default: .)
  -m / --max-workers     Parallel download threads (default: 5)
  --overwrite            Overwrite existing files
  --hide-pbar            Hide progress bar
  --no-extract           Skip extraction of downloaded archives

The .txt file header line #dataset=<name> is read automatically, so passing -d is optional if the file was generated by search.
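The header convention is easy to reproduce in your own tooling. A minimal sketch of reading it back (the helper name is hypothetical; only the `#dataset=<name>` line format comes from the text above):

```python
def read_dataset_header(textfile):
    """Return the dataset name from a '#dataset=<name>' header line, or None."""
    with open(textfile) as f:
        first = f.readline().strip()
    if first.startswith("#dataset="):
        return first.split("=", 1)[1]
    return None
```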

download-browse

Download individual browse (preview) images from a vector file. Each scene is saved as a separate file named after its entity_id.

usgsxplore download-browse [OPTIONS] VECTOR_FILE

TIF files are georeferenced using corner coordinate columns from the vector file. JPG files are saved without georeferencing.

Options:
  -o / --output-dir   Output directory (default: ./browse_images/)
  -f / --format       Output format: tif (default) or jpg
  -m / --max-workers  Parallel download threads (default: 4)
  --overwrite         Overwrite existing files
  --hide-pbar         Hide progress bar

# Download as georeferenced GeoTIFF (default)
usgsxplore download-browse results.gpkg -o ./previews/

# Download as JPEG
usgsxplore download-browse results.gpkg -o ./previews/ --format jpg
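Georeferencing from corner coordinates, as done for the TIF output above, amounts to building an affine transform that maps pixel indices to map coordinates. A hedged sketch of the idea, assuming an axis-aligned, north-up image with known upper-left and lower-right corners (function names are illustrative, not the package's API):

```python
def affine_from_corners(ul, lr, width, height):
    """Build a GDAL-style geotransform (x0, px_w, 0, y0, 0, px_h) from
    upper-left and lower-right (lon, lat) corners and image size in pixels."""
    x0, y0 = ul
    x1, y1 = lr
    px_w = (x1 - x0) / width
    px_h = (y1 - y0) / height  # negative for north-up images
    return (x0, px_w, 0.0, y0, 0.0, px_h)

def pixel_to_coords(gt, col, row):
    """Map a pixel (col, row) to map coordinates using the geotransform."""
    x0, px_w, _, y0, _, px_h = gt
    return (x0 + col * px_w, y0 + row * px_h)
```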

info

# List all available datasets
usgsxplore info dataset

# List available metadata filters for a dataset
usgsxplore info filters DATASET

Tip: Trigger filter help directly from a search by using an invalid value:

# List all filter fields for the declassii dataset
usgsxplore search declassii -f "whatever=?"

# List all valid values for the "camera" filter
usgsxplore search declassii -f "camera=?"

Filter syntax

The --filter option accepts a human-readable expression:

"field1=value1 & field2=value2 | field3=value3"

Fields can be identified by their filter ID, label, or SQL field name. Values can be the raw value or the display label. All of the following are equivalent:

usgsxplore search declassii --filter "camera=L"
usgsxplore search declassii --filter "Camera Type=L"
usgsxplore search declassii --filter "5e839ff8cfa94807=L"
usgsxplore search declassii --filter "camera=KH-9 Lower Resolution Mapping Camera"

Combine multiple filters:

# KH-9 scenes that are available for download
usgsxplore search declassii --filter "camera=L & DOWNLOAD_AVAILABLE=Y"
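A rough sketch of how such an expression could be tokenized (illustrative only; this is not the package's actual parser, and `parse_filter` is a hypothetical name):

```python
import re

def parse_filter(expr):
    """Split 'f1=v1 & f2=v2 | f3=v3' into (field, value) pairs plus the
    '&'/'|' operators joining them."""
    tokens = re.split(r"\s*([&|])\s*", expr.strip())
    pairs, ops = [], []
    for i, tok in enumerate(tokens):
        if i % 2 == 0:  # even positions are field=value terms
            field, _, value = tok.partition("=")
            pairs.append((field.strip(), value.strip()))
        else:           # odd positions are the joining operators
            ops.append(tok)
    return pairs, ops
```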

Python API

All CLI commands have equivalent Python functions in usgsxplore.core:

from usgsxplore.core import (
    search_scenes,
    download_scenes,
    download_browse_images,
    list_datasets,
    list_dataset_filters,
)

Search

# Print entity IDs to stdout
search_scenes("landsat_tm_c2_l1",
    location=(5.7074, 45.1611),
    interval_date=("2010-01-01", "2020-01-01"),
)

# Save to multiple formats
search_scenes("declassii",
    output_files=["results.gpkg", "map.html"],
    filter_str="camera=H",
    limit=500,
    show_progress=True,
)

Credentials are read from USGS_USERNAME / USGS_TOKEN environment variables by default, or can be passed explicitly:

search_scenes("declassii", username="myuser", token="mytoken", ...)

Download

download_scenes("results.txt",
    output_dir="./data",
    max_workers=8,
)

Browse images

# Download as georeferenced GeoTIFF (default)
download_browse_images("results.gpkg", output_dir="./previews")

# Download as JPEG
download_browse_images("results.gpkg", output_dir="./previews", fmt="jpg")

# Advanced: use BrowseDownloader directly with a custom strategy
from usgsxplore.browse import BrowseDownloader, TifSaveStrategy, JpgSaveStrategy

downloader = BrowseDownloader("./previews", TifSaveStrategy(), max_workers=8)
downloader.download("results.gpkg")
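The TifSaveStrategy/JpgSaveStrategy split follows the classic strategy pattern: the downloader delegates the "how to save" step to an interchangeable object. A generic sketch of the design (class names simplified and hypothetical, not the package's actual implementation):

```python
from typing import Protocol

class SaveStrategy(Protocol):
    """What a save strategy must provide: a file extension and a save method."""
    extension: str
    def save(self, data: bytes, path: str) -> None: ...

class RawSave:
    """Illustrative strategy: write the downloaded bytes as-is."""
    extension = ".jpg"
    def save(self, data: bytes, path: str) -> None:
        with open(path, "wb") as f:
            f.write(data)

class Downloader:
    """Delegates per-file saving to whichever strategy it was given."""
    def __init__(self, out_dir: str, strategy: SaveStrategy):
        self.out_dir = out_dir
        self.strategy = strategy
    def target_path(self, entity_id: str) -> str:
        return f"{self.out_dir}/{entity_id}{self.strategy.extension}"
```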

Inspect datasets and filters

# List datasets
datasets = list_datasets()

# List filters for a dataset
filters = list_dataset_filters("declassii")
for f in filters:
    print(f["fieldLabel"], "→", f["searchSql"])

For a full example notebook, see examples/download.ipynb.


Contributing

See CONTRIBUTING.md.
