
Python client library for WCPS (OGC Web Coverage Processing Service) backends.

Project description

Overview

The OGC Web Coverage Processing Service (WCPS) standard defines a protocol-independent declarative query language for the extraction, processing, and analysis of multi-dimensional coverages (datacubes) representing sensor, image, or statistics data.

This Python library allows you to dynamically build WCPS queries and execute them on a WCPS server. To query a WCS server for information on available data, check the WCS Python Client.

Installation

pip install wcps

Examples

Subsetting

Extracting spatio-temporal subsets can be done with the subscripting operator [], by specifying lower and upper bounds for the axes we want to trim, or a single bound to slice an axis at a particular index.

from wcps.model import Datacube

# Slice the time axis (with name ansi) at "2021-04-09",
# and trim on the spatial axes
cov = Datacube("S2_L2A_32631_TCI_60m")[
      "ansi" : "2021-04-09",
      "E" : 669960 : 700000,
      "N" : 4990200 : 5015220 ]

# encode final result to JPEG
query = cov.encode("JPEG")

The Service class allows executing the query on the server and getting back a WCPSResult object. Optionally, array results can be automatically converted to a numpy array by passing convert_to_numpy=True to the execute method.

from wcps.service import Service

service = Service("https://ows.rasdaman.org/rasdaman/ows")
result = service.execute(query)
# or automatically convert the result to a numpy array
result = service.execute(query, convert_to_numpy=True)

Alternatively, the result can be saved to a file with the download method:

service.download(query, output_file='tci.png')

or displayed with show, mainly for demo purposes:

service.show(query)

Note that calling this method requires the following dependencies:

  • pip install pillow - for image results
  • pip install netCDF4 - for netCDF results

Band Math

Derive an NDVI map from red and near-infrared bands of a Sentinel-2 datacube, threshold the very green areas (values greater than 0.5) as true values (white in a PNG), and save the result as a PNG image.

from wcps.service import Service
from wcps.model import Datacube, Axis

subset = [Axis("ansi", "2021-04-09"),
          Axis("E", 670000, 680000),
          Axis("N", 4990220, 5000220)]

red = Datacube("S2_L2A_32631_B04_10m")[subset]
nir = Datacube("S2_L2A_32631_B08_10m")[subset]

# NDVI formula
ndvi = (nir - red) / (nir + red)
# threshold NDVI values to highlight areas with high vegetation
vegetation = ndvi > 0.5
# encode final result to PNG
query = vegetation.encode("PNG")

service = Service("https://ows.rasdaman.org/rasdaman/ows")
service.show(query)
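To make the band math concrete, here is a plain-Python sketch of the per-cell arithmetic the server performs; the reflectance values below are made up for illustration and do not come from the coverages above:

```python
# Plain-Python illustration (not the wcps API) of the per-cell NDVI
# computation; the server evaluates this formula over every cell.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

# made-up reflectance values for a vegetated and a bare-soil cell
print(ndvi(0.8, 0.1))        # ~0.78, dense vegetation
print(ndvi(0.3, 0.25))       # ~0.09, little vegetation
print(ndvi(0.8, 0.1) > 0.5)  # True -> rendered white in the PNG
```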

Instead of explicitly writing the NDVI formula (ndvi = (nir - red) / (nir + red)), we can use the NDVI class from the wcps.spectral module. The wcps.spectral module defines classes for over 200 spectral indices based on the standardized Awesome Spectral Indices list. Given the required spectral bands, each class automatically applies the formula to compute the respective index.

from wcps.service import Service
from wcps.model import Datacube, Axis
from wcps.spectral import NDVI

subset = [Axis("ansi", "2021-04-09"),
          Axis("E", 670000, 680000),
          Axis("N", 4990220, 5000220)]

red = Datacube("S2_L2A_32631_B04_10m")[subset]
nir = Datacube("S2_L2A_32631_B08_10m")[subset]

# spectral index class automatically applies the formula 
ndvi = NDVI(N=nir, R=red)

vegetation = ndvi > 0.5
query = vegetation.encode("PNG")

service = Service("https://ows.rasdaman.org/rasdaman/ows")
service.show(query)

Alternatively, we can use the spyndex library, which supports the same list of indices. Make sure to first install it with pip install spyndex pyarrow setuptools, and then we can perform the NDVI computation with:

import spyndex

ndvi = spyndex.computeIndex("NDVI", {"N": nir, "R": red})

Composites

A false-color composite can be created by providing the corresponding bands in a MultiBand object:

from wcps.service import Service
from wcps.model import Datacube, MultiBand

# defined in previous example
subset = ...

green = Datacube("S2_L2A_32631_B03_10m")[subset]
red = Datacube("S2_L2A_32631_B04_10m")[subset]
nir = Datacube("S2_L2A_32631_B08_10m")[subset]

# false-color composite
false_color = MultiBand({"red": nir, "green": red, "blue": green})

# scale the cell values to fit in the 0-255 range suitable for PNG
scaled = false_color / 17.0

# execute the query on the server and show the result
service = Service("https://ows.rasdaman.org/rasdaman/ows")
service.show(scaled.encode("PNG"))
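The division by 17.0 above is a plain linear rescaling. Assuming raw cell values of up to about 4335 (an assumption made here for illustration, not a documented property of these coverages), it maps them into the 0-255 range that PNG expects:

```python
# Plain-Python sketch of the rescaling: cell values up to ~4335
# (assumed maximum, for illustration only) map into 0-255.
raw_values = [0, 1700, 4335]
scaled = [v / 17.0 for v in raw_values]
print(scaled)  # [0.0, 100.0, 255.0]
```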

Matching Resolution / Projection

What if the bands we want to combine come from coverages with different resolutions? We can scale the bands to a common resolution before combining them, e.g. below we combine B12 from a 20m coverage, and B8 / B3 from higher-resolution 10m coverages.

from wcps.service import Service
from wcps.model import Datacube, MultiBand

# defined in previous example
subset = ...

green = Datacube("S2_L2A_32631_B03_10m")[subset]
swir = Datacube("S2_L2A_32631_B12_20m")[subset]
nir = Datacube("S2_L2A_32631_B08_10m")[subset]

# upscale swir to match the resolution of green
swir = swir.scale(another_coverage=green)

# false-color composite
composite = MultiBand({"red": swir, "green": nir, "blue": green})

# scale the cell values to fit in the 0-255 range suitable for PNG
scaled = composite / 17.0

# execute the query on the server and show the response
service = Service("https://ows.rasdaman.org/rasdaman/ows")
service.show(scaled.encode("PNG"))

Matching different CRS projections can be done by reprojecting the operands to a common target CRS.

Basic Aggregation

We can calculate the average NDVI as follows:

nir = ...
red = ...
# NDVI formula
ndvi = (nir - red) / (nir + red)
# get average NDVI value
query = ndvi.avg()

service = ...
result = service.execute(query)
print(f'The average NDVI is {result.value}')

Other reduce methods include sum(), max(), min(), all(), some().
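As a rough mental model, these reduce operations behave like Python's built-in aggregations applied to the flattened cell values. This is a plain-Python analogy, not the wcps API; note that some() corresponds to Python's any():

```python
# Plain-Python analogy of the reduce methods, applied to made-up cell values
cells = [0.25, 0.5, 0.75, 1.0]

print(sum(cells))               # sum()  -> 2.5
print(sum(cells) / len(cells))  # avg()  -> 0.625
print(max(cells), min(cells))   # max(), min()

print(all(c > 0 for c in cells))    # all():  every cell satisfies the predicate
print(any(c > 0.7 for c in cells))  # some(): at least one cell does
```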

Timeseries Aggregation

A more advanced expression is the general condenser (aggregation) operation. The example below calculates a map with the maximum cell values across all time slices of a 3D datacube between "2015-01-01" and "2015-07-01", considering only the time slices with an average greater than 20:

from wcps.model import Datacube, AxisIter, Condense, CondenseOp
from wcps.service import Service

cov = Datacube("AvgTemperatureColorScaled")

# iterator named ansi_iter over the subset of a temporal axis ansi
ansi_iter = AxisIter("ansi_iter", "ansi") \
            .of_geo_axis(cov["ansi" : "2015-01-01" : "2015-07-01"])

max_map = (Condense(CondenseOp.MAX)
           .over( ansi_iter )
           .where( cov["ansi": ansi_iter.ref()].avg() > 20 )
           .using( cov["ansi": ansi_iter.ref()] ))

query = max_map.encode("PNG")

service = Service("https://ows.rasdaman.org/rasdaman/ows")
service.download(query, 'max_map.png')

How about calculating the average of each time slice between two dates? This can be done with a coverage constructor, which iterates over all dates between the two given dates, resulting in a 1D array of average values; notice that the slicing on the time axis ansi is done with the "iterator" variable ansi_iter as in the previous example. Finally, the 1D array is encoded as JSON.

from wcps.model import Datacube, AxisIter, Coverage
from wcps.service import Service

# same as in the previous example
cov = Datacube("AvgTemperatureColorScaled")
ansi_iter = AxisIter("ansi_iter", "ansi") \
            .of_geo_axis(cov["ansi" : "2015-01-01" : "2015-07-01"])
ansi_iter_ref = ansi_iter.ref()

# compute averages per time slice
averages = Coverage("average_per_date") \
           .over( ansi_iter ) \
           .values( cov["ansi": ansi_iter_ref].Red.avg() )

query = averages.encode("JSON")

service = Service("https://ows.rasdaman.org/rasdaman/ows")
result = service.execute(query)

# print result
print(result.value)

# visualize the result as a diagram; requires:
# pip install matplotlib
import matplotlib.pyplot as plt
plt.plot(result.value, marker='o')
plt.title('Average per Date')
plt.xlabel('Date Index')
plt.ylabel('Average')
plt.show()

The returned JSON list contains only the average values, and not the datetimes to which they correspond. As a result, the "Date Index" values on the X axis are just numbers from 0 to 6.

To get the date values, we can use the WCS Python Client. Make sure to install it first with pip install wcs.

from wcs.service import WebCoverageService

# get a coverage object that can be inspected for information
endpoint = "https://ows.rasdaman.org/rasdaman/ows"
wcs_service = WebCoverageService(endpoint)
cov = wcs_service.list_full_info('AvgTemperatureColorScaled')

# ansi is an irregular axis in this coverage, and we can get the
# coefficients within the subset above with the [] operator
subset_dates = cov.bbox.ansi["2015-01-01" : "2015-07-01"]

# visualize the result as a diagram
import matplotlib.pyplot as plt

plt.plot(subset_dates, result.value)
plt.title('Average per Date')
plt.xlabel('Date')
plt.ylabel('Average')
plt.show()

User-Defined Functions (UDF)

UDFs can be executed with the Udf object:

from wcps.service import Service
from wcps.model import Datacube, Udf

cov = Datacube("S2_L2A_32631_B04_10m")[
      "ansi" : "2021-04-09",
      "E" : 670000 : 680000,
      "N" : 4990220 : 5000220 ]

# Apply the image.stretch(cov) UDF to stretch the values of
# cov in the [0-255] range, so it can be encoded in JPEG
stretched = Udf('image.stretch', [cov]).encode("JPEG")

# execute the query on the server and show the result
service = Service("https://ows.rasdaman.org/rasdaman/ows")
service.show(stretched)

Contributing

The directory structure is as follows:

  • wcps - the main library code
  • tests - testing code
  • docs - documentation in reStructuredText format

Tests

To run the tests:

# install dependencies
pip install wcps[tests]

pytest

Documentation

To build the documentation:

# install dependencies
pip install wcps[docs]

cd docs
make html

The built documentation can be found in the docs/_build/html/ subdir.

Acknowledgments

Created in project EU FAIRiCUBE, with funding from the Horizon Europe programme under grant agreement No 101059238.


