openEO process graph to evalscript converter
This repository contains a library for converting openEO process graphs to Sentinel Hub evalscripts.
The motivation behind this library is to reduce the data transfer between the SH backend and the openEO backend and to move part of the processing directly to the backend where the data is stored.
API
convert_from_process_graph
convert_from_process_graph(
    process_graph,
    n_output_bands=1,
    sample_type="FLOAT32",
    units=None,
    bands_dimension_name="bands",
    temporal_dimension_name="t",
    bands_metadata=[]
)
Parameters
- process_graph : dict. openEO process graph JSON as a Python dict object.
- n_output_bands : int, optional. Default: 1. Number of output bands in the evalscript. This can be set if the value is known beforehand. See docs.
- sample_type : str, optional. Default: FLOAT32. Desired sampleType of the output raster. See possible values.
- units : str, optional. Default: None. Units used by all the bands in the evalscript. If None, the units evalscript parameter isn't set and the default units for each band are used. See docs.
- bands_dimension_name : str, optional. Default: bands. Name of the default dimension of type bands of the datacube, as set in load_collection and referred to in the openEO process graph.
- temporal_dimension_name : str, optional. Default: t. Name of the default dimension of type temporal of the datacube, as set in load_collection and referred to in the openEO process graph.
- bands_metadata : list, optional. Default: []. List of metadata information for all bands of a certain collection.
- encode_result : bool, optional. Default: True. Whether the result of the evalscript should be encoded with the dimensions of the data or returned as is.
Output
- evalscripts : list of dicts. Returns a list of dicts containing the Evalscript objects. Every element consists of:
  - invalid_node_id : Id of the first invalid node after a supported subgraph. The output of the associated evalscript should be the input of that node. If it is None, the entire graph is valid.
  - evalscript : an instance of an Evalscript object that generates an evalscript for a valid subgraph from load_collection to the node with id invalid_node_id.
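For illustration, a minimal sketch of calling the converter and inspecting its output; the tiny two-node process graph and the top-level import path are assumptions for this example, not values prescribed by the library:

from pg_to_evalscript import convert_from_process_graph

# Illustrative graph: load a collection, then save the result.
process_graph = {
    "load": {
        "process_id": "load_collection",
        "arguments": {"id": "sentinel-2-l2a", "bands": ["B04", "B08"]},
    },
    "save": {
        "process_id": "save_result",
        "arguments": {"data": {"from_node": "load"}, "format": "GTIFF"},
        "result": True,
    },
}

results = convert_from_process_graph(process_graph, sample_type="FLOAT32")
for result in results:
    if result["invalid_node_id"] is None:
        print("Entire graph converted:", result["evalscript"])
    else:
        print("Conversion stops before node:", result["invalid_node_id"])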
Evalscript
Constructor parameters
- input_bands : list. List of bands to be imported. See docs.
- nodes : list. List of Node objects that constitute the valid process (sub)graph.
- initial_data_name : str. The id of the initial load_collection node that loads the data.
- n_output_bands : int, optional. Default: 1. Number of output bands in the evalscript. This can be set if the value is known beforehand. See docs.
- sample_type : str, optional. Default: FLOAT32. Desired sampleType of the output raster. See possible values.
- units : str, optional. Default: None. Units used by all the bands in the evalscript. If None, the units evalscript parameter isn't set and the default units for each band are used. See docs.
- mosaicking : str, optional. Default: ORBIT. Works with multi-temporal data by default. See possible values.
- bands_dimension_name : str, optional. Default: bands. Name of the default dimension of type bands of the datacube, as set in load_collection and referred to in the openEO process graph.
- temporal_dimension_name : str, optional. Default: t. Name of the default dimension of type temporal of the datacube, as set in load_collection and referred to in the openEO process graph.
- datacube_definition_directory : str, optional. Default: javascript_datacube. Relative path to the directory with the JavaScript implementations of the processes.
- output_dimensions : list of dicts or None, optional. Default: None. Information about the dimensions in the output datacube. This can be set if the value is known beforehand. Each element contains: name (name of the dimension), size (size/length of the dimension) and, optionally, original_temporal (boolean, should be True if this is the temporal dimension generated in the initial load_collection node).
- encode_result : bool, optional. Default: True. Whether the result of the evalscript should be encoded with the dimensions of the data or returned as is.
- bands_metadata : list, optional. Default: []. List of metadata information for all bands of a certain collection.
Methods
- write() : Returns the evalscript as a string.
- determine_output_dimensions() : Calculates the greatest possible dimensions of the output datacube, returning a list of dicts. Each element contains: name (name of the dimension), size (size/length of the dimension) and, optionally, original_temporal (boolean, True if this is the temporal dimension generated in the initial load_collection node).
- set_output_dimensions(output_dimensions) : Setter for output dimensions. output_dimensions is a list of dicts in the same format as returned by determine_output_dimensions().
- set_input_bands(input_bands) : Setter for input bands. input_bands is an array of strings (band names) or None. Output dimensions are recalculated.
- get_decoding_function() : Returns a decode_data function. The data returned by the evalscript is encoded to contain information about the datacube dimensions and has to be decoded to obtain the actual data in ndarray format. decode_data takes a single parameter, data: the result of processing with the associated evalscript, which should be a three-dimensional array. decode_data returns a multidimensional Python list.
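An Evalscript is normally obtained from convert_from_process_graph rather than constructed by hand. A hedged sketch of the methods above, assuming process_graph is any supported openEO process graph (as in the earlier example):

evalscript = convert_from_process_graph(process_graph)[0]["evalscript"]

# Inspect and (re)apply the inferred output dimensions.
dimensions = evalscript.determine_output_dimensions()
evalscript.set_output_dimensions(dimensions)

# Restrict the imported bands; output dimensions are recalculated.
evalscript.set_input_bands(["B04", "B08"])

# Render the evalscript source and obtain the matching decoder.
source = evalscript.write()
decode_data = evalscript.get_decoding_function()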
list_supported_processes
Returns a list of process ids of supported openEO processes.
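For example, assuming the function is exposed at the package root like convert_from_process_graph (the output shown is abridged and illustrative):

from pg_to_evalscript import list_supported_processes

print(list_supported_processes())
>>> ['absolute', 'add', 'apply', ...]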
Workflow
- Construct the openEO process graph
Load a file with json.load or generate an openEO process graph using the openEO Python client.
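For example, loading a graph stored on disk (the file name is a placeholder):

import json

with open("process_graph.json") as f:
    process_graph = json.load(f)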
- Run the conversion
subgraph_evalscripts = convert_from_process_graph(process_graph)
print(subgraph_evalscripts)
>>> [{'evalscript': <pg_to_evalscript.evalscript.Evalscript object at 0x000001ABA779CA00>, 'invalid_node_id': None}]
In this example, the entire openEO process graph could be converted to an evalscript, so we only have one entry.
- Fetch the data
evalscript = subgraph_evalscripts[0]['evalscript'].write()
print(evalscript)
>>> "//VERSION=3 function setup(){ ..."
The evalscript string can now be used to process data on Sentinel Hub. The Sentinel Hub Python client makes it easy to do so, as sketched below.
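A hedged sketch with the sentinelhub Python client; the collection, bounding box, time interval and resolution are placeholders to replace with your own:

from sentinelhub import (
    SHConfig, SentinelHubRequest, DataCollection, BBox, CRS, MimeType, bbox_to_dimensions
)

config = SHConfig()  # assumes your Sentinel Hub credentials are configured
bbox = BBox([13.35, 45.77, 13.45, 45.85], crs=CRS.WGS84)  # placeholder area of interest
size = bbox_to_dimensions(bbox, resolution=60)

request = SentinelHubRequest(
    evalscript=evalscript,  # the string produced by write() above
    input_data=[
        SentinelHubRequest.input_data(
            data_collection=DataCollection.SENTINEL2_L2A,
            time_interval=("2022-06-01", "2022-06-30"),
        )
    ],
    responses=[SentinelHubRequest.output_response("default", MimeType.TIFF)],
    bbox=bbox,
    size=size,
    config=config,
)

fetched_data = request.get_data()[0]  # numpy array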
- Decode the fetched data
# Get the decoding function for this evalscript
decoding_function = evalscript.get_decoding_function()
# Pass the fetched data through the decoding function.
# The function expects a Python list. If you're using the Sentinel Hub Python client, the result might be a numpy array, so it has to be converted.
decoded_data = decoding_function(fetched_data.tolist())
print(decoded_data)
>>> [[[1, 2, 3], [4, 5, 6], ... ]]
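If you need a numpy array again, the decoded nested list can be converted back; its shape depends on the output dimensions of your graph:

import numpy as np

decoded_array = np.array(decoded_data)
print(decoded_array.shape)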
Running notebooks
pipenv install --dev --pre
Start the notebooks
cd notebooks
pipenv run jupyter notebook
Running tests
Using Docker
docker build -t tests .
docker run tests
Directly
Tests require a NodeJS environment.
pipenv install
pipenv shell
cd tests
pytest
Linting
pipenv run black -l 120 .
Benchmark
pipenv shell
cd tests
python benchmark.py
Developing
Install the package in editable mode so the changes take effect immediately.
pipenv install -e .