Tool for aggregating raw NWP files into .zarr files

NWP CONSUMER

Microservice for consuming NWP data.


A microservice for multi-source consumption of NWP data, storing it in a common format. Built with inspiration from the Hexagonal Architecture pattern, the nwp-consumer is currently packaged with adapters for pulling and converting .grib data from:

  • CEDA

  • MetOffice

Similarly, the service can write to multiple sinks:

  • Local filesystem

  • S3

Its modular nature enables straightforward extension to alternative sources and sinks in the future.

Running the service

The service uses environment variables to configure sources and sinks, in accordance with the Twelve-Factor App methodology. The program will inform you of any missing environment variables when using an adapter, but you can also check the configuration for the given module, or use the env command.
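As an illustrative sketch of this Twelve-Factor style of configuration (the variable names below are hypothetical examples, not the service's actual environment variables), an adapter's config can fail fast on missing required variables and fall back to defaults for optional ones:

```python
import os


class ExampleS3Config:
    """Illustrative config object populated from environment variables."""

    def __init__(self) -> None:
        # Raise early if a required variable is missing, mirroring how the
        # service reports missing env vars when an adapter is selected.
        try:
            self.bucket = os.environ["EXAMPLE_S3_BUCKET"]
        except KeyError as e:
            raise OSError(f"Missing required env var: {e}") from e
        # Optional variables can fall back to sensible defaults.
        self.region = os.environ.get("EXAMPLE_S3_REGION", "eu-west-1")


os.environ["EXAMPLE_S3_BUCKET"] = "my-nwp-data"
cfg = ExampleS3Config()
```

Constructing the config at startup means a misconfigured deployment fails immediately rather than midway through a download.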

Using Docker

This service is designed to be run as a Docker container. The Containerfile is the Dockerfile for the service. Running it this way is recommended due to its dependency on external non-Python binaries, which at the moment cannot be easily distributed in a PyPI package. To run, pull the latest version from ghcr.io via:

$ docker run \
  -v /path/to/datadir:/data \
  -e ENV_VAR=<value> \
  ghcr.io/openclimatefix/nwp-consumer:latest <command...>  

Using the Python Package

Ensure the external dependencies are installed. Then, do one of the following:

Either

  • Install from PyPI with
    $ pip install nwp-consumer
    

or

  • Clone the repository and install the package via
    $ git clone git@github.com:openclimatefix/nwp-consumer.git
    $ cd nwp-consumer
    $ pip install .
    

Then run the service via

$ ENV_VAR=<value> nwp-consumer <command...> 

CLI

Whether running via Docker or the Python package, the available commands can be found with the help command or the --help flag. For example:

$ nwp-consumer --help
# or
$ docker run ghcr.io/openclimatefix/nwp-consumer:latest --help

Ubiquitous Language

The following terms are used throughout the codebase and documentation. They are defined here to avoid ambiguity.

  • InitTime - The time at which a forecast is initialised. For example, a forecast initialised at 12:00 on 1st January.

  • TargetTime - The time at which a predicted value is valid. For example, a forecast with InitTime 12:00 on 1st January predicts that the temperature at TargetTime 12:00 on 2nd January at position x will be 10 degrees.
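The relationship between the two terms can be expressed with standard-library datetimes: adding a forecast step offset to the InitTime yields the TargetTime. The values below simply restate the example from the definitions above:

```python
from datetime import datetime, timedelta, timezone

# A forecast initialised at 12:00 UTC on 1st January...
init_time = datetime(2023, 1, 1, 12, 0, tzinfo=timezone.utc)

# ...predicting 24 hours ahead is valid at 12:00 on 2nd January.
step = timedelta(hours=24)
target_time = init_time + step

print(target_time.isoformat())  # 2023-01-02T12:00:00+00:00
```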

Repository structure

Produced using exa:

$ exa --tree --git-ignore -F -I "*init*|test*.*"
./
├── Containerfile # The Dockerfile for the service
├── pyproject.toml # The build configuration for the service
├── README.md
└── src/
   ├── nwp_consumer/ # The main library package
   │  ├── cmd/
   │  │  └── main.py # The entrypoint to the service
   │  └── internal/ # Packages internal to the service. Like the 'lib' folder
   │     ├── config/ 
   │     │  └── config.py # Contains the configuration specification for running the service
   │     ├── inputs/ # Holds subpackages for each incoming data source
   │     │  ├── ceda/
   │     │  │  ├── _models.py
   │     │  │  ├── client.py # Contains the client and functions to map CEDA data to the service model
   │     │  │  └── README.md # Info about the CEDA data source
   │     │  └── metoffice/
   │     │     ├── _models.py
   │     │     ├── client.py # Contains the client and functions to map MetOffice data to the service model
   │     │     └── README.md # Info about the MetOffice data source
   │     ├── models.py # Describes the internal data models for the service
   │     ├── outputs/ # Holds subpackages for each data sink
   │     │  ├── localfs/
   │     │  │  └── client.py # Contains the client for storing data on the local filesystem
   │     │  └── s3/
   │     │     └── client.py # Contains the client for storing data on S3
   │     └── service/ # Contains the business logic and use-cases of the application
   │        └── service.py # Defines the service class for the application, whose methods are the use-cases
   └── test_integration/

nwp-consumer is structured following principles from the hexagonal architecture pattern. In brief, this means a clear separation between the application's business logic - its Core - and the Actors that are external to it. In this package, the core of the service is in internal/service/ and the actors are in internal/inputs/ and internal/outputs/. The service logic has no knowledge of the external actors, instead defining interfaces that the actors must implement. These are found in internal/models.py. The actors are then responsible for implementing these interfaces, and are dependency-injected at runtime. This allows the service to be easily tested and extended. See further reading for more information.
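As a minimal sketch of that pattern (the port and class names here are simplified stand-ins, not the actual interface definitions in internal/models.py), the core defines abstract ports and the service receives concrete adapters at construction time:

```python
from abc import ABC, abstractmethod


class FetcherPort(ABC):
    """Port that any input adapter (e.g. CEDA, MetOffice) must implement."""

    @abstractmethod
    def fetch(self, init_time: str) -> bytes: ...


class StoragePort(ABC):
    """Port that any output adapter (e.g. local filesystem, S3) must implement."""

    @abstractmethod
    def store(self, data: bytes) -> str: ...


class Service:
    """Core use-cases; knows only the ports, never the concrete adapters."""

    def __init__(self, fetcher: FetcherPort, storage: StoragePort) -> None:
        self.fetcher = fetcher
        self.storage = storage

    def consume(self, init_time: str) -> str:
        return self.storage.store(self.fetcher.fetch(init_time))


# In tests, lightweight fakes can stand in for the real adapters:
class FakeFetcher(FetcherPort):
    def fetch(self, init_time: str) -> bytes:
        return b"raw grib bytes for " + init_time.encode()


class FakeStorage(StoragePort):
    def store(self, data: bytes) -> str:
        return f"/data/{len(data)}-bytes.zarr"


result = Service(FakeFetcher(), FakeStorage()).consume("2023-01-01T12:00")
```

Because the Service only depends on the two abstract ports, swapping S3 for the local filesystem - or adding a new data source - requires no change to the core logic.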

Local development

Clone the repository, then create and activate a new Python virtualenv for it. cd to the repository root.

Install the External and Python dependencies as shown in the sections below.

Taskfile

This repository bundles often-used commands into a taskfile for convenience. To use them, ensure go-task is installed (easily done via Homebrew).

You can then see the available tasks using

$ task -l

External dependencies

The cfgrib Python library depends on the ECMWF cfgrib binary, which is a wrapper around the ECMWF ecCodes library. One of these must be installed on the system and accessible as a shared library.

On macOS with Homebrew, use

$ brew install eccodes

Or, if you manage binary packages with Conda, use

$ conda install -c conda-forge cfgrib

Alternatively, you may install the official source distribution by following the instructions at https://confluence.ecmwf.int/display/ECC/ecCodes+installation

You may run a simple selfcheck command to ensure that your system is set up correctly:

$ python -m <eccodes OR cfgrib> selfcheck
Found: ecCodes v2.27.0.
Your system is ready.

Python requirements

Install the required Python dependencies and make the package editable with

$ pip install -e . 

or use the taskfile

$ task install

This looks for requirements specified in the pyproject.toml file.

Note that these are the bare dependencies for running the application. If you want to run tests, you need the development dependencies as well, which can be installed via

$ pip install -e ".[dev]"

or

$ task install-dev

Where is the requirements.txt file?

There is no requirements.txt file. Instead, the project uses setuptools' pyproject.toml integration to specify dependencies. This is a relatively new feature of setuptools and pip, and is the recommended way to specify dependencies. See the setuptools guide and the PEP 621 specification for more information, as well as Further Reading.
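For illustration, a pyproject.toml declaring runtime and development dependencies looks roughly like this (the package names below are examples, not the project's actual dependency list):

```toml
[project]
name = "nwp-consumer"
dependencies = [
    # Runtime requirements, installed by `pip install .`
    "xarray",
    "cfgrib",
]

[project.optional-dependencies]
dev = [
    # Extra tools pulled in by `pip install -e ".[dev]"`
    "ruff",
]
```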

Running tests

Ensure you have installed the Python requirements and the External dependencies.

Run the unit tests with

$ python -m unittest discover -s src/nwp_consumer -p "test_*.py"

or

$ task test-unit

and the integration tests with

$ python -m unittest discover -s test_integration -p "test_*.py"

or

$ task test-integration
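The discover commands above pick up any file whose name matches test_*.py. A minimal unit test in that convention (a hypothetical example, not one of the service's actual tests) looks like:

```python
import unittest


class TestExample(unittest.TestCase):
    """Discovered by unittest when saved in a file matching "test_*.py"."""

    def test_addition(self) -> None:
        self.assertEqual(1 + 1, 2)


# Run the case programmatically; `unittest discover` does this per matching file.
result = TestExample("test_addition").run()
print(result.wasSuccessful())  # True
```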

See further reading for more information on the src directory structure.


Further reading

On packaging a python project using setuptools and pyproject.toml:

On hexagonal architecture:

On the directory structure:


Contributing and community

