
Python client library for WCPS (OGC Web Coverage Processing Service) backends.


Overview

The OGC Web Coverage Processing Service (WCPS) standard defines a protocol-independent declarative query language for the extraction, processing, and analysis of multi-dimensional coverages (datacubes) representing sensor, image, or statistics data.

This Python library allows you to dynamically build WCPS queries and execute them on a WCPS server. To query a WCS server for information on available data, check the WCS Python Client.

Installation

pip install wcps

Examples

Subsetting

Extracting spatio-temporal subsets can be done with the subscripting operator [], by specifying lower and upper bounds for the axes we want to trim, or a single bound to slice an axis at a particular index.

from wcps.service import Service
from wcps.model import Datacube

# Slice the time axis (with name ansi) at "2021-04-09",
# and trim on the spatial axes
cov = Datacube("S2_L2A_32631_TCI_60m")[
      "ansi" : "2021-04-09",
      "E" : 669960 : 700000,
      "N" : 4990200 : 5015220 ]

# encode final result to JPEG
query = cov.encode("JPEG")

# execute the query on the server and get back a WCPSResult
service = Service("https://ows.rasdaman.org/rasdaman/ows")
result = service.execute(query)

# show the returned image; requires pillow to be installed:
# pip install pillow
from PIL import Image
from io import BytesIO
Image.open(BytesIO(result.value)).show()

# alternatively, save the content of the response into a file
service.download(query, output_file='tci.jpg')

Band Math

Derive an NDVI map from the red and near-infrared bands of a Sentinel-2 datacube, threshold it to highlight high-vegetation areas (NDVI values greater than 0.5 become true, shown as white in the PNG), and save the result as a PNG image.

from wcps.service import Service
from wcps.model import Datacube, Axis

subset = [Axis("ansi", "2021-04-09"),
          Axis("E", 670000, 680000),
          Axis("N", 4990220, 5000220)]

red = Datacube("S2_L2A_32631_B04_10m")[subset]
nir = Datacube("S2_L2A_32631_B08_10m")[subset]

# NDVI formula
ndvi = (nir - red) / (nir + red)
# threshold NDVI values to highlight areas with high vegetation
vegetation = ndvi > 0.5
# encode final result to PNG
query = vegetation.encode("PNG")

# execute the query on the server and get back the response
service = Service("https://ows.rasdaman.org/rasdaman/ows")
result = service.execute(query)

# show the returned image; requires pillow to be installed:
# pip install pillow
from PIL import Image
from io import BytesIO
Image.open(BytesIO(result.value)).show()

# similar to above, but automatically convert the PNG result 
# to a numpy array
result = service.execute(query, convert_to_numpy=True)

Composites

A false-color composite can be created by providing the corresponding bands in a MultiBand object:

from wcps.service import Service
from wcps.model import Datacube, MultiBand

# defined in previous example
subset = ...

green = Datacube("S2_L2A_32631_B03_10m")[subset]
red = Datacube("S2_L2A_32631_B04_10m")[subset]
nir = Datacube("S2_L2A_32631_B08_10m")[subset]

# false-color composite
false_color = MultiBand({"red": nir, "green": red, "blue": green})

# scale the cell values to fit in the 0-255 range suitable for PNG
scaled = false_color / 17.0

# execute the query on the server and get back the response
service = Service("https://ows.rasdaman.org/rasdaman/ows")
result = service.execute(scaled.encode("PNG"))

# show the returned image; requires pillow to be installed:
# pip install pillow
from PIL import Image
from io import BytesIO
Image.open(BytesIO(result.value)).show()

Matching Resolution / Projection

What if the bands we want to combine come from coverages with different resolutions? We can scale the bands to a common resolution before applying the operations; below we combine B12 from a 20m coverage with B8 and B3 from a higher-resolution 10m coverage.

from wcps.service import Service
from wcps.model import Datacube, MultiBand

# defined in previous example
subset = ...

green = Datacube("S2_L2A_32631_B03_10m")[subset]
swir = Datacube("S2_L2A_32631_B12_20m")[subset]
nir = Datacube("S2_L2A_32631_B08_10m")[subset]

# upscale swir to match the resolution of green
swir = swir.scale(another_coverage=green)

# false-color composite
composite = MultiBand({"red": swir, "green": nir, "blue": green})

# scale the cell values to fit in the 0-255 range suitable for PNG
scaled = composite / 17.0

# execute the query on the server and get back the response
service = Service("https://ows.rasdaman.org/rasdaman/ows")
result = service.execute(scaled.encode("PNG"))

# show the returned image; requires pillow to be installed:
# pip install pillow
from PIL import Image
from io import BytesIO
Image.open(BytesIO(result.value)).show()

Matching different CRS projections can be done by reprojecting the operands to a common target CRS.

Basic Aggregation

We can calculate the average NDVI as follows:

nir = ...
red = ...
# NDVI formula
ndvi = (nir - red) / (nir + red)
# get average NDVI value
query = ndvi.avg()

service = ...
result = service.execute(query)
print(f'The average NDVI is {result.value}')

Other reduce methods include sum(), max(), min(), all(), some().

Timeseries Aggregation

A more advanced expression is the general condenser (aggregation) operation. The following example computes a map of maximum cell values across all time slices of a 3D datacube between "2015-01-01" and "2015-07-01", considering only the time slices whose average is greater than 20:

from wcps.model import Datacube, AxisIter, Condense, CondenseOp
from wcps.service import Service

cov = Datacube("AvgTemperatureColorScaled")

# iterator named ansi_iter over the subset of a temporal axis ansi
ansi_iter = AxisIter("ansi_iter", "ansi") \
            .of_geo_axis(cov["ansi" : "2015-01-01" : "2015-07-01"])

max_map = (Condense(CondenseOp.MAX)
           .over( ansi_iter )
           .where( cov["ansi": ansi_iter.ref()].avg() > 20 )
           .using( cov["ansi": ansi_iter.ref()] ))

query = max_map.encode("PNG")

service = Service("https://ows.rasdaman.org/rasdaman/ows")
service.download(query, 'max_map.png')

How about calculating the average of each time slice between two dates? This can be done with a coverage constructor, which will iterate over all dates between the two given dates, resulting in a 1D array of average NDVI values; notice that the slicing on the time axis ansi is done with the "iterator" variable ansi_iter like in the previous example. The 1D array is encoded as JSON in the end.

from wcps.model import Datacube, AxisIter, Coverage
from wcps.service import Service

# same as in the previous example
cov = Datacube("AvgTemperatureColorScaled")
ansi_iter = AxisIter("ansi_iter", "ansi") \
            .of_geo_axis(cov["ansi" : "2015-01-01" : "2015-07-01"])
ansi_iter_ref = ansi_iter.ref()

# compute averages per time slice
averages = Coverage("average_per_date") \
           .over( ansi_iter ) \
           .values( cov["ansi": ansi_iter_ref].Red.avg() )

query = averages.encode("JSON")

service = Service("https://ows.rasdaman.org/rasdaman/ows")
result = service.execute(query)

# print result
print(result.value)

# visualize the result as a diagram; requires:
# pip install matplotlib
import matplotlib.pyplot as plt
plt.plot(result.value, marker='o')
plt.title('Average per Date')
plt.xlabel('Date Index')
plt.ylabel('Average')
plt.show()

The returned JSON list contains only the average values, not the datetimes to which they correspond. As a result, the "Date Index" values on the X axis are just numbers from 0 to 6.

To get the date values, we can use the WCS Python Client. Make sure to install it first with pip install wcs.

from wcs.service import WebCoverageService

# get a coverage object that can be inspected for information
endpoint = "https://ows.rasdaman.org/rasdaman/ows"
wcs_service = WebCoverageService(endpoint)
cov = wcs_service.list_full_info('AvgTemperatureColorScaled')

# ansi is an irregular axis in this coverage, and we can get the
# coefficients within the subset above with the [] operator
subset_dates = cov.bbox.ansi["2015-01-01" : "2015-07-01"]

# visualize the result as a diagram
import matplotlib.pyplot as plt

plt.plot(subset_dates, result.value)
plt.title('Average per Date')
plt.xlabel('Date')
plt.ylabel('Average')
plt.show()

User-Defined Functions (UDF)

UDFs can be executed with the Udf object:

from wcps.service import Service
from wcps.model import Datacube, Udf

cov = Datacube("S2_L2A_32631_B04_10m")[
      "ansi" : "2021-04-09",
      "E" : 670000 : 680000,
      "N" : 4990220 : 5000220 ]

# Apply the image.stretch(cov) UDF to stretch the values of
# cov in the [0-255] range, so it can be encoded in JPEG
stretched = Udf('image.stretch', [cov]).encode("JPEG")

# execute the query on the server and get back a WCPSResult
service = Service("https://ows.rasdaman.org/rasdaman/ows")
result = service.execute(stretched)

# show the returned image
from PIL import Image
from io import BytesIO
Image.open(BytesIO(result.value)).show()

Contributing

The directory structure is as follows:

  • wcps - the main library code
  • tests - testing code
  • docs - documentation in reStructuredText format

Tests

To run the tests:

# install dependencies
pip install wcps[tests]

pytest

Documentation

To build the documentation:

# install dependencies
pip install wcps[docs]

cd docs
make html

The built documentation can be found in the docs/_build/html/ subdirectory.

Acknowledgments

Created in project EU FAIRiCUBE, with funding from the Horizon Europe programme under grant agreement No 101059238.
