usgsxplore

Python client for the USGS M2M API — search, download, and process Earth observation imagery from EarthExplorer.

Supports 100+ datasets (Landsat, Hexagon KH-9, declassified imagery, aerial photos, and more). Provides both a CLI and a Python API.

Inspired by landsatxplore, with broader dataset support and additional features.


Installation

pip install usgsxplore

# or with pipx (recommended for CLI use)
pipx install usgsxplore

Credentials

You need a USGS ERS account with M2M API access enabled.

  1. Register at ers.cr.usgs.gov/register
  2. Request M2M API access at ers.cr.usgs.gov/profile/access — specify the datasets you plan to use
  3. Use your username and M2M token (not your password)

Set credentials as environment variables to avoid typing them every time:

export USGS_USERNAME=<your_username>
export USGS_TOKEN=<your_token>
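The same fallback applies to the Python API. As a minimal sketch of the resolution order (explicit arguments first, then environment variables; the helper name is mine, not part of the library):

```python
import os

def resolve_credentials(username=None, token=None):
    """Illustrative sketch: use explicit credentials when given,
    otherwise fall back to USGS_USERNAME / USGS_TOKEN env vars."""
    username = username or os.environ.get("USGS_USERNAME")
    token = token or os.environ.get("USGS_TOKEN")
    if not username or not token:
        raise RuntimeError(
            "USGS credentials not found; set USGS_USERNAME and USGS_TOKEN"
        )
    return username, token
```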

Quick start

# Search for Landsat scenes at a location between 2010 and 2020
usgsxplore search landsat_tm_c2_l1 --location 5.7074 45.1611 --interval-date 2010-01-01 2020-01-01

# Search for Hexagon KH-9 scenes and export to GeoPackage + HTML map
usgsxplore search declassii --filter "camera=H" --output results.gpkg --output map.html

# Download the first 10 results
usgsxplore search landsat_tm_c2_l1 --limit 10 --output results.txt
usgsxplore download results.txt

CLI Reference

usgsxplore [OPTIONS] COMMAND [ARGS]...

Commands:
  search           Search scenes in a dataset
  download         Download scenes from a text file of entity IDs
  download-browse  Download individual browse (preview) images from a vector file
  info             List available datasets and metadata filters

search

Search scenes in a dataset, with optional spatial, temporal, and metadata filters.

usgsxplore search [OPTIONS] DATASET
Option                  Description
-o / --output           Output file; repeatable, format inferred from extension
-vf / --vector-file     Vector file for spatial filter (.gpkg, .shp, .geojson)
-l / --location         Point filter: longitude latitude
-b / --bbox             Bounding box: xmin ymin xmax ymax
-c / --clouds           Max cloud cover percentage (1-100)
-i / --interval-date    Date range: YYYY-MM-DD YYYY-MM-DD
-f / --filter           Metadata filter string (see Filter syntax)
-m / --limit            Max number of results (default: all)
--pbar                  Show progress bar

Output formats:

Extension                  Content
.txt                       Entity IDs, one per line; usable with download
.json                      Raw API response with full metadata
.gpkg / .shp / .geojson    Vector file with scene footprints
.html                      Interactive map for quick visualization

Multiple outputs can be specified simultaneously:

usgsxplore search declassii --filter "camera=H" --output scenes.gpkg --output map.html
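The extension-based dispatch above can be sketched as a simple lookup table. This is an illustrative sketch only (the mapping keys come from the table above, but the function name and internal writer selection are assumptions, not the library's actual code):

```python
from pathlib import Path

# Output kinds by file extension, mirroring the table above.
FORMAT_BY_EXTENSION = {
    ".txt": "entity-id list",
    ".json": "raw API response",
    ".gpkg": "vector footprints",
    ".shp": "vector footprints",
    ".geojson": "vector footprints",
    ".html": "interactive map",
}

def infer_output_format(path: str) -> str:
    """Illustrative sketch: pick the output writer from the extension,
    as the repeatable -o / --output option does."""
    ext = Path(path).suffix.lower()
    try:
        return FORMAT_BY_EXTENSION[ext]
    except KeyError:
        raise ValueError(f"unsupported output extension: {ext}")
```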

download

Download scenes from a .txt file of entity IDs (produced by search).

usgsxplore download [OPTIONS] TEXTFILE
Option                  Description
-d / --dataset          Dataset name (auto-read from file header if present)
-p / --product-number   Product index when multiple products are available
-o / --output-dir       Output directory (default: .)
-m / --max-workers      Parallel download threads (default: 5)
--overwrite             Overwrite existing files
--hide-pbar             Hide progress bar
--no-extract            Skip extraction of downloaded archives

The .txt file header line #dataset=<name> is read automatically, so passing -d is optional if the file was generated by search.
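The header convention is easy to work with from your own scripts. A minimal sketch of parsing such a file (the function is illustrative; the library's internal reader may differ):

```python
def read_entity_id_file(lines):
    """Illustrative sketch: split a results .txt into its optional
    '#dataset=<name>' header and the list of entity IDs."""
    dataset = None
    entity_ids = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        if line.startswith("#"):
            # Comment/header line; pick up the dataset name if present.
            if line.startswith("#dataset="):
                dataset = line.split("=", 1)[1]
        else:
            entity_ids.append(line)
    return dataset, entity_ids
```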

download-browse

Download individual browse (preview) images from a vector file. Each scene is saved as a separate file named after its entity_id.

usgsxplore download-browse [OPTIONS] VECTOR_FILE

TIF files are georeferenced using corner coordinate columns from the vector file. JPG files are saved without georeferencing.

Option                  Description
-o / --output-dir       Output directory (default: ./browse_images/)
-f / --format           Output format: tif (default) or jpg
-m / --max-workers      Parallel download threads (default: 4)
--overwrite             Overwrite existing files
--hide-pbar             Hide progress bar

# Download as georeferenced GeoTIFF (default)
usgsxplore download-browse results.gpkg -o ./previews/

# Download as JPEG
usgsxplore download-browse results.gpkg -o ./previews/ --format jpg
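The TIF georeferencing described above amounts to deriving an affine geotransform from the corner coordinates. A minimal sketch of that computation (the function name and the north-up, axis-aligned assumption are mine; real scene footprints are often rotated quadrilaterals, which need more than this):

```python
def geotransform_from_corners(upper_left, lower_right, width, height):
    """Illustrative sketch: build a GDAL-style geotransform
    (origin_x, pixel_w, 0, origin_y, 0, -pixel_h) from the
    upper-left / lower-right corners and the image size in pixels.
    Assumes a north-up, axis-aligned footprint."""
    ul_x, ul_y = upper_left
    lr_x, lr_y = lower_right
    pixel_w = (lr_x - ul_x) / width   # map units per pixel, x
    pixel_h = (ul_y - lr_y) / height  # map units per pixel, y
    return (ul_x, pixel_w, 0.0, ul_y, 0.0, -pixel_h)
```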

info

# List all available datasets
usgsxplore info dataset

# List available metadata filters for a dataset
usgsxplore info filters DATASET

Tip: Trigger filter help directly from a search by using an invalid value:

# List all filter fields for the declassii dataset
usgsxplore search declassii -f "whatever=?"

# List all valid values for the "camera" filter
usgsxplore search declassii -f "camera=?"

Filter syntax

The --filter option accepts a human-readable expression:

"field1=value1 & field2=value2 | field3=value3"

Fields can be identified by their filter ID, label, or SQL field name. Values can be the raw value or the display label. All of the following are equivalent:

usgsxplore search declassii --filter "camera=L"
usgsxplore search declassii --filter "Camera Type=L"
usgsxplore search declassii --filter "5e839ff8cfa94807=L"
usgsxplore search declassii --filter "camera=KH-9 Lower Resolution Mapping Camera"

Combine multiple filters:

# KH-9 scenes that are available for download
usgsxplore search declassii --filter "camera=L & DOWNLOAD_AVAILABLE=Y"
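To show what the expression grammar boils down to, here is a sketch of splitting such a string into (field, value) pairs and the operators between them. This is illustrative only; the library's actual parser may handle quoting, precedence, and field resolution differently:

```python
import re

def parse_filter(expr):
    """Illustrative sketch: tokenize 'f1=v1 & f2=v2 | f3=v3' into
    a list of (field, value) pairs and the '&'/'|' operators."""
    tokens = re.split(r"\s*([&|])\s*", expr.strip())
    pairs, ops = [], []
    for i, tok in enumerate(tokens):
        if i % 2 == 0:
            # Even positions are 'field=value' clauses.
            field, _, value = tok.partition("=")
            pairs.append((field.strip(), value.strip()))
        else:
            # Odd positions are the operators between clauses.
            ops.append(tok)
    return pairs, ops
```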

Python API

All CLI commands have equivalent Python functions in usgsxplore.core:

from usgsxplore.core import (
    search_scenes,
    download_scenes,
    download_browse_images,
    list_datasets,
    list_dataset_filters,
)

Search

# Print entity IDs to stdout
search_scenes("landsat_tm_c2_l1",
    location=(5.7074, 45.1611),
    interval_date=("2010-01-01", "2020-01-01"),
)

# Save to multiple formats
search_scenes("declassii",
    output_files=["results.gpkg", "map.html"],
    filter_str="camera=H",
    limit=500,
    show_progress=True,
)

Credentials are read from USGS_USERNAME / USGS_TOKEN environment variables by default, or can be passed explicitly:

search_scenes("declassii", username="myuser", token="mytoken", ...)

Download

download_scenes("results.txt",
    output_dir="./data",
    max_workers=8,
)

Browse images

# Download as georeferenced GeoTIFF (default)
download_browse_images("results.gpkg", output_dir="./previews")

# Download as JPEG
download_browse_images("results.gpkg", output_dir="./previews", fmt="jpg")

# Advanced: use BrowseDownloader directly with a custom strategy
from usgsxplore.browse import BrowseDownloader, TifSaveStrategy, JpgSaveStrategy

downloader = BrowseDownloader("./previews", TifSaveStrategy(), max_workers=8)
downloader.download("results.gpkg")

Inspect datasets and filters

# List datasets
datasets = list_datasets()

# List filters for a dataset
filters = list_dataset_filters("declassii")
for f in filters:
    print(f["fieldLabel"], "→", f["searchSql"])

For a full example notebook, see examples/download.ipynb.


Contributing

See CONTRIBUTING.md.
