Project description

General Description

  • Purpose: Retrieve CMEMS sea-surface height forecasts at the coast (EU-wide), add other relevant processes to produce a coastal total water level (TWL), and evaluate the TWLs against pre-defined flood-triggering thresholds.
  • Outputs:
    • Netcdf files based on the bulletin date of the execution day (t0), containing coastal time-series for the selected product region (ARC, MED, IBI, BS, BAL, NWS) for 7 days [t0-2 : t0+5] (dimensions: Ncoastalpoints x times).
    • A csv file containing the triggering information for each coastal point, as defined by the pre-defined target coastal points (e.g. the hindcast in the case of ECFAS).

Installation

We recommend using miniconda.

Production

$ conda create -n ecfas 'python>=3.10'
$ conda activate ecfas
$ conda install -c fbriol pyfes
$ conda install -c conda-forge geopandas
$ conda install -c conda-forge proj
$ conda install -c conda-forge geos
$ conda install -c conda-forge cartopy
$ conda install -c conda-forge xarray dask netCDF4 bottleneck
$ pip install ecfas

Development

$ git clone git@gitlab.mercator-ocean.fr:mirazoki/ecfas.git
$ cd ecfas
$ conda env create -f conda-env.yml
$ conda activate ecfas
$ pip install -e .

External dependencies

The workflow relies on Copernicus Marine Service (CMEMS) data, which requires user credentials, and optionally on FES2014 tide data (see the fesdir entry in the configuration file below).

Create a configuration file

Create a file with the following content:

# CMEMS credentials and url for data download
usr=
pwd=
url=http://nrt.cmems-du.eu/motu-web/Motu
# Directory for outputs
outdir=
# Directory for masks
maskdir=
# Leave blank if you do not want any log files (console output only)
logdir=
# FES data directory; if blank, no FES data is used
fesdir=

Directory paths can be absolute or relative. If relative, they are assumed to be relative to the directory from which the scripts are run.
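
For illustration, a filled-in configuration file (with purely hypothetical credentials and paths) might look like:

usr=my_cmems_username
pwd=my_cmems_password
url=http://nrt.cmems-du.eu/motu-web/Motu
outdir=/data/ecfas/output
maskdir=/data/ecfas/masks
logdir=/data/ecfas/logs
fesdir=/data/fes2014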

Usage

The package currently comprises two commands:

  • op_workflow: generates the coastal TWL netcdf series based on the CMEMS products.
  • op_trigger: evaluates the TWLs generated in op_workflow against pre-defined target coastal points and their corresponding thresholds for triggering and extreme-event duration determination. The output is a csv file with the trigger information, defined on the target coastal points.

For ECFAS, the commands should be run sequentially, first the workflow and then the trigger. The workflow can be run independently (and hence in parallel) for each regional domain, while the trigger should be run once the workflow has run for all domains. A description of how to run each of the commands is provided below.
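
As an illustration of this sequencing, a minimal orchestration sketch in Python (assuming the op_workflow and op_trigger entry points described in this README and a configuration file named ecfas.cnf; the bulletin date is the example date used below) could look like:

# Hypothetical orchestration sketch: run op_workflow for every regional
# domain, then op_trigger once all regions are done. Error handling is minimal.
import subprocess

regions = ["NWS", "IBI", "MED", "BAL", "BS", "ARC"]
config = "ecfas.cnf"          # configuration file as described above
bulletin = "20220125_000000"  # example bulletin date (%Y%m%d_%H%M%S)

for region in regions:
    # Each regional workflow is independent and could also be run in parallel
    subprocess.run(["op_workflow", "-c", config, "-r", region, "-t", bulletin], check=True)

# The trigger is run once, after the workflow has completed for all domains
subprocess.run(["op_trigger", "-c", config, "-t", bulletin], check=True)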

Running the workflow (op_workflow)

  • User guide: The workflow is run separately for each regional domain in Europe, namely NWS, IBI, MED, BAL, BS, ARC (see optional argument -r). For operational purposes (e.g. ECFAS), the workflow should be scheduled at the corresponding daily forecast update time for each domain:

    • NWS: North West Shelf, daily update time: 12:00

    • IBI: Iberian Biscay and Ireland, daily update time: 14:00

    • MED: Mediterranean Sea, daily update time: 20:00

    • BAL: Baltic Sea, daily update time: 22:00

    • BS: Black Sea , daily update time: 12:00

    • ARC: Arctic , daily update time: 04:00

The workflow needs as a minimum the configuration file to run. The optional arguments are the following:

-r <region> : Region, matching the 6 Copernicus Marine Service regional domains (see User guide). Default: NWS

-t <%Y%m%d_%H%M%S>: Bulletin date for the forecast data. Default: Forecast update time of execution day
  • Usage: op_workflow -c <config_file> [-r <region>] [-t <%Y%m%d_%H%M%S>] [--reanal] [--debug]

Example call: op_workflow -c ecfas.cnf -r NWS -t 20220125_000000

The --debug flag notably prevents the cleanup of previously downloaded files (cleanup is the default behaviour), which speeds up the debugging process.

There are some particularities for two of the domains:

  • For BS water levels, the FES2014 tides are added because tides are lacking in the CMEMS model.
  • For ARC water levels, the ocean product in the CMEMS catalogue (ARCTIC_ANALYSIS_FORECAST_PHYS_002_001_A) and the tide and surge model (ARCTIC_ANALYSISFORECAST_PHY_TIDE_002_015) are added together. Some double-counting is expected.

Note: if the requested date is in the past, the workflow will access the analysis rather than the forecast.

  • Output: Netcdf files based on the bulletin date of the execution day (t0), containing coastal time-series for the selected product region (ARC, MED, IBI, BS, BAL, NWS) for 7 days [t0-2 : t0+5] (dimensions: Ncoastalpoints x times)

Workflow description

Functions called within main, in this order:

  1. motu_download.py: Download fields from CMEMS DU for the selected region, timeframe and bulletin date >> CMEMS daily fields to $region/data/*.nc
  2. coast_water_level_extract_multiple.py: For the given timeframe [t0-2 : t0+5] (=[tini, tend]), snip fields to the prescribed coastal locations and interpolate all variables to common location-times >> CMEMS coastal series to $region/data/tseries_coastal_$bulletindate_$tini_$tend.nc
  3. coast_water_level_process.py: Read the time-series and add other relevant coastal WL contributions (tide if not present, wave setup), write out in daily files >> TWL coastal series to $region/timeseries/TScoast_$region_b$bulletindate_$tini_$tend.nc

The files under $region/timeseries/ are the coastal TWL forecasts. These are used in ECFAS to trigger the warning and mapping component of the system.
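
As a sketch of how these files can be inspected with xarray (the file path below is a placeholder following the naming pattern above, and the README does not document the variable names inside the files, so they are listed rather than assumed):

# Minimal sketch: open a coastal TWL forecast file produced by op_workflow
# (path relative to the configured outdir) and inspect its contents.
import xarray as xr

path = "NWS/timeseries/TScoast_NWS_b<bulletindate>_<tini>_<tend>.nc"  # placeholder name
ds = xr.open_dataset(path)
print(ds.dims)        # expect a coastal-points dimension and a time dimension
print(ds.data_vars)   # check the actual TWL variable name before selecting data
# Example selection once the names are known (hypothetical names):
# twl_point = ds["twl"].isel(points=0)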

Running the trigger (op_trigger)

  • User guide: The trigger is run for all domain folders found in the output directory defined in the configuration file. For ECFAS, it should be run after the workflow (op_workflow) has been run for all regions (NWS, IBI, MED, BAL, BS, ARC).

The trigger needs as a minimum the configuration file to run, which is the same as used for the workflow (op_workflow). From this configuration file, only the output directory is used by op_trigger.

The optional arguments are the following:

-t <%Y%m%d_%H%M%S>: Bulletin date for the forecast data. Default: Forecast update time of execution day (same as in op_workflow; see the details for that command above)

  • Usage: op_trigger -c <config_file> [-t <%Y%m%d_%H%M%S>]

Example call: op_trigger -c ecfas.cnf -t 20220125_000000

  • Output: csv file (trigger/Trigg_info.csv) with all relevant information for coastal flood triggering at the prescribed target coastal points

Trigger description

Functions called within main, in this order:

  1. coast_water_level_trigger.py : For the coastal TWLs produced in op_workflow, collect all regional coastal series and produce trigger information in the form of a csv file.
  • input files: pre-defined csv files inside ecfas/thresholds/ containing the target coastal locations and thresholds for triggering and duration.

  • output files: $outdir/trigger/Trigg_info.csv, where $outdir is the parent directory containing all output generated by op_workflow (as prescribed in the configuration file). The csv file contains, for each target coastal point, the following information (a reading sketch follows the list):

    • Lon: hindcast longitude
    • Lat: hindcast latitude
    • lon_map: forecast longitude
    • lat_map: forecast latitude
    • map_id: forecast coastal station name
    • dist: distance between hindcast and assigned forecast point
    • thr1: triggering threshold
    • thr2: duration threshold
    • flag: triggered YES/NO
    • fhours: number of hourly points exceeding the duration threshold
    • maxwl: maximum WL over the forecast
    • maxwlt: time of max WL
    • ffirst: first value above triggering threshold
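
A minimal reading sketch with pandas, using the column names and the YES/NO flag values listed above (adjust if the actual CSV header differs):

# Minimal sketch: read the trigger output (path relative to $outdir) and
# keep only the coastal points flagged as triggered.
import pandas as pd

trig = pd.read_csv("trigger/Trigg_info.csv")
triggered = trig[trig["flag"] == "YES"]
print(triggered[["map_id", "maxwl", "maxwlt", "fhours"]])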

GeoJson outputs

Conversion of the ecfas nc coastal TWL time-series to geojson format, with the addition of the trigger threshold for coastal flood warning. It therefore requires that the ecfas workflow and trigger have been run first. If this is not the case, a dummy threshold (-9999) is introduced.

Example usage: nc2geojson -o <output_dir> [-r <region>] [-t <%Y%m%d_%H%M%S>]

Where:

-o <output_dir>: Output directory where the daily regional netcdf time-series forecasts are saved.

Optional arguments:

-r <region>: Region, matching the 6 Copernicus Marine Service regional domains. Defaults to all ['NWS','IBI','MED','BAL','BS','ARC']

-t <%Y%m%d_%H%M%S>: Bulletin date for the forecast data. Defaults to the forecast update time of the execution day

Output: written to the folder <output_dir>/<region>/timeseries/geojson. The outputs are geojson files (*.json). The filenames are the same as the original netcdf filenames; only the extension changes.
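
As a quick check, the converted files can be opened with any GeoJSON-aware tool. A minimal sketch with the Python standard library (the file path is a placeholder built from the naming convention above, and the feature properties are not documented here, so they are printed rather than assumed):

# Minimal sketch: load one GeoJSON output and list its features.
import json

path = "NWS/timeseries/geojson/TScoast_NWS_b<bulletindate>_<tini>_<tend>.json"  # placeholder name
with open(path) as f:
    data = json.load(f)

features = data.get("features", [])
print(len(features), "features in the file")
for feat in features[:3]:
    # Print the geometry type and available property keys for a few features
    print(feat.get("geometry", {}).get("type"), sorted(feat.get("properties", {}).keys()))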

Test data and checks

Baselines for tests are included in the GitLab repository in the baselines directory, and are managed with git-lfs.

Quality checks:

  1. Verification of workflow output (op_workflow) against baseline data:
qual_checks [-h] -o <output_dir> -b <baseline_dir> -r <region> -t <YYmmdd_HHMMSS>

Process input arguments:
  -o <output_dir>, --outputs <output_dir>
                        Absolute path to output data to be checked
  -b <baseline_dir>, --baselines <baseline_dir>
                        Absolute path to baseline data to be checked against

optional arguments:
  -h, --help            show this help message and exit

  -r <region>, --region <region>
                        Region of interest, one of ARC, BAL, BS, IBI, MED, NWS. Defaults to all
  -t <YYmmdd_HHMMSS>, --t0 <YYmmdd_HHMMSS>
                        Start time t0 in the format YYmmdd_HHMMSS
  2. Verification of trigger output (op_trigger) against baseline data:
check_trigger [-h] -o <output_dir> -b <baseline_dir>
Process input arguments.
  -o <output_dir>, --outputs <output_dir>
                        Absolute path to output data to be checked
  -b <baseline_dir>, --baselines <baseline_dir>
                        Absolute path to baseline data to be checked against
optional arguments:
  -h, --help            show this help message and exit
  3. Validation of the netcdf time-series resulting from the workflow (op_workflow)
op_validate [-h] -o <output_dir> -r <region> [-s <Y-m-d H-M-S>] [-e <Y-m-d H-M-S>]

Process input arguments.

optional arguments:
  -h, --help            show this help message and exit
  -o <output_dir>, --outputs <output_dir>
                        Absolute path to output data to be checked
  -r <region>, --region <region>
                        Region of interest, one of ARC, BAL, BS, IBI, MED, NWS, GLO. Defaults to all
  -s <Y-m-d H-M-S>, --t-start <Y-m-d H-M-S>
                        Start time in the format Y-m-d H-M-S
  -e <Y-m-d H-M-S>, --t-end <Y-m-d H-M-S>
                        End time in the format Y-m-d H-M-S

Running unit and functional tests

Unit and functional tests are found in the test and functional_test directories, respectively.

To run the unit and functional tests, first install pytest and pytest-cov. Then, for example, run:

pytest -v -s --log-cli-level=INFO test/*
