Tiles router for xpublish

xpublish-tiles

Web mapping plugins for Xpublish

Project Overview

This project contains a set of web mapping plugins for Xpublish - a framework for serving xarray datasets via HTTP APIs.

The goal of this project is to transform xarray datasets to raster, vector and other types of tiles, which can then be served via HTTP APIs. To do this, the package implements a set of xpublish plugins:

  • xpublish_tiles.xpublish.tiles.TilesPlugin: An OGC Tiles conformant plugin for serving raster, vector and other types of tiles.
  • xpublish_tiles.xpublish.wms.WMSPlugin: An OGC Web Map Service conformant plugin for serving rendered map images of the same data.
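With a server running (for example via the CLI described below), each plugin mounts its own endpoint tree. A minimal smoke test, assuming the default port of 8080; the exact WMS query parameters are the standard GetCapabilities request, not something specified above:

```shell
BASE="http://localhost:8080"

# OGC Tiles endpoints (TilesPlugin)
curl -fsS "$BASE/tiles/" || echo "tiles endpoint not reachable"

# OGC WMS endpoint (WMSPlugin); GetCapabilities is the standard entry point
curl -fsS "$BASE/wms/?service=WMS&request=GetCapabilities" || echo "wms endpoint not reachable"
```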

Development

Sync the environment with uv

uv sync

Run the type checker

uv run ty check

Run the tests

uv run pytest tests

Run the setup tests (these create local datasets that can then be served with the CLI)

uv run pytest --setup

CLI Usage

The package includes a command-line interface for quickly serving datasets with tiles and WMS endpoints:

uv run xpublish-tiles [OPTIONS]

Options

  • --port PORT: Port to serve on (default: 8080)
  • --dataset DATASET: Dataset to serve (default: global)
    • global: Generated global dataset with synthetic data
    • air: Tutorial air temperature dataset from xarray tutorial
    • hrrr: High-Resolution Rapid Refresh dataset
    • para: Parameterized dataset
    • eu3035: European dataset in ETRS89 / LAEA Europe projection
    • eu3035_hires: High-resolution European dataset
    • ifs: Integrated Forecasting System dataset
    • curvilinear: Curvilinear coordinate dataset
    • sentinel: Sentinel-2 dataset (without coordinates)
    • global-6km: Global dataset at 6km resolution
    • xarray://<tutorial_name>: Load any xarray tutorial dataset (e.g., xarray://rasm)
    • local://<dataset_name>: Load dataset from local icechunk repository at /tmp/tiles-icechunk/ (datasets created with uv run pytest --setup)
    • local:///path/to/repo::<dataset_name>: Load dataset from custom icechunk repository path
    • For Arraylake datasets: specify the dataset name in {arraylake_org}/{arraylake_dataset} format (requires Arraylake credentials)
  • --branch BRANCH: Branch to use for Arraylake or icechunk datasets (default: main)
  • --group GROUP: Group to use for Arraylake datasets (default: '')
  • --cache: Enable icechunk cache for Arraylake and local icechunk datasets (default: enabled)
  • --spy: Run benchmark requests with the specified dataset for performance testing
  • --concurrency INT: Number of concurrent requests for benchmarking (default: 12)
  • --where CHOICE: Where to run benchmark requests (choices: local, local-booth, prod; default: local)
    • local: Start server on localhost and run benchmarks against it
    • local-booth: Run benchmarks against existing localhost server (no server startup)
    • prod: Run benchmarks against production server
  • --log-level LEVEL: Set the logging level for xpublish_tiles (choices: debug, info, warning, error; default: warning)

Tip: To use local datasets (e.g., local://ifs, local://para_hires), first create them with uv run pytest --setup. This creates icechunk repositories at /tmp/tiles-icechunk/.

Examples

# Serve synthetic global dataset on default port 8080
xpublish-tiles

# Serve air temperature tutorial dataset on port 9000
xpublish-tiles --port 9000 --dataset air

# Serve built-in test datasets
xpublish-tiles --dataset hrrr
xpublish-tiles --dataset para
xpublish-tiles --dataset eu3035_hires

# Load xarray tutorial datasets
xpublish-tiles --dataset xarray://rasm
xpublish-tiles --dataset xarray://ersstv5

# Serve locally stored datasets (first create them with `uv run pytest --setup`)
xpublish-tiles --dataset local://ifs
xpublish-tiles --dataset local://para_hires

# Serve local icechunk data from custom path
xpublish-tiles --dataset local:///path/to/my/repo::my_dataset

# Serve Arraylake dataset with specific branch and group
xpublish-tiles --dataset earthmover-public/aifs-outputs --branch main --group 2025-04-01/12z

# Run benchmark with a specific dataset
xpublish-tiles --dataset local://para_hires --spy

# Run benchmark with custom concurrency and against production
xpublish-tiles --dataset para --spy --concurrency 20 --where prod

# Enable debug logging
xpublish-tiles --dataset hrrr --log-level debug

Benchmarking

The CLI includes a benchmarking feature that can be used to test tile server performance:

# Run benchmark with a specific dataset (starts server automatically)
xpublish-tiles --dataset local://para_hires --spy

# Run benchmark against existing localhost server
xpublish-tiles --dataset para --spy --where local-booth

# Run benchmark against production server with custom concurrency
xpublish-tiles --dataset para --spy --where prod --concurrency 8

The --spy flag enables benchmarking mode. The benchmarking behavior depends on the --where option:

  • --where local (default): Starts the tile server and automatically runs benchmark requests against it
  • --where local-booth: Runs benchmarks against an existing localhost server (doesn't start a new server)
  • --where prod: Runs benchmarks against a production server

The benchmarking process:

  • Warms up the server with initial tile requests
  • Makes concurrent tile requests (configurable with --concurrency, default: 12) to test performance
  • Uses dataset-specific benchmark tiles or falls back to global tiles
  • Automatically exits after completing the benchmark run
  • Uses appropriate colorscale ranges based on dataset attributes
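In spirit, the concurrent-request step amounts to something like the following curl/xargs loop; the tile coordinates here are hypothetical stand-ins, not the dataset-specific benchmark tiles the CLI actually selects:

```shell
N=12  # matches the --concurrency default
# fire N concurrent tile requests; each failure is tolerated so the loop
# completes even if no server is listening
seq 0 $((N - 1)) \
  | xargs -P "$N" -I{} sh -c \
      'curl -fsS -o /dev/null "http://localhost:8080/tiles/WebMercatorQuad/4/2/{}?variables=2t" || true'
```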

Once running, the server provides:

  • Tiles API at http://localhost:8080/tiles/
  • WMS API at http://localhost:8080/wms/
  • Interactive API documentation at http://localhost:8080/docs

An example tile URL:

http://localhost:8080/tiles/WebMercatorQuad/4/4/14?variables=2t&style=raster/viridis&colorscalerange=280,300&width=256&height=256&valid_time=2025-04-03T06:00:00

Here 4/4/14 gives the tile coordinates in {z}/{y}/{x} order (zoom/row/column).
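The same request can be scripted by filling in the URL template; the values below are simply those from the example above:

```shell
Z=4; Y=4; X=14
URL="http://localhost:8080/tiles/WebMercatorQuad/${Z}/${Y}/${X}"
QUERY="variables=2t&style=raster/viridis&colorscalerange=280,300&width=256&height=256"

# fetch the rendered tile as an image (assumes the server is running)
curl -fsS "${URL}?${QUERY}" -o "tile_${Z}_${Y}_${X}.png" || echo "server not reachable"
```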

Integration Examples

Deployment notes

  1. Make sure to limit NUMBA_NUM_THREADS; Numba is used by datashader when rendering categorical data.
  2. The first invocation of a render will block while datashader functions are JIT-compiled. Our attempts to add a precompilation step to remove this have been unsuccessful.
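For example, the thread cap can be set in the environment before launching the server; the value 4 below is an illustrative assumption, not a project recommendation:

```shell
# Cap the threads Numba may spawn for datashader's JIT-compiled renderers
export NUMBA_NUM_THREADS=4

# then launch the server as usual, e.g.:
#   uv run xpublish-tiles --dataset global
```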

Environment variables

  1. XPUBLISH_TILES_ASYNC_LOAD: [0, 1] - controls whether Xarray's async loading is used.
  2. XPUBLISH_TILES_NUM_THREADS: int - controls the size of the threadpool.
  3. XPUBLISH_TILES_TRANSFORM_CHUNK_SIZE: int - when transforming coordinates, submit N×N chunks to the threadpool.
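Put together, a sketch of a tuned environment; the specific values are illustrative assumptions, since the defaults are not documented above:

```shell
export XPUBLISH_TILES_ASYNC_LOAD=1              # use Xarray's async loading
export XPUBLISH_TILES_NUM_THREADS=8             # size of the worker threadpool
export XPUBLISH_TILES_TRANSFORM_CHUNK_SIZE=512  # transform coordinates in 512x512 chunks
```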
