PVNet
This project is used for training PVNet and running PVNet on live data.
PVNet is a multi-modal late-fusion model for predicting renewable energy generation from weather data. The NWP (Numerical Weather Prediction) and satellite data are sent through a neural network which encodes them down to 1D intermediate representations. These are concatenated together with recent generation, the calculated solar coordinates (azimuth and elevation) and the location ID which has been put through an embedding layer. This 1D concatenated feature vector is put through an output network which outputs predictions of the future energy yield.
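The late-fusion flow described above can be sketched in a few lines. This is a toy illustration only, not the actual model code: the dimensions, the random-projection "encoder", and the single linear output layer are all made up for clarity; in PVNet each encoder and the output network are learned neural networks.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, out_dim):
    # Stand-in for a neural encoder: flatten the input and project it to a
    # 1D vector. In PVNet this is a learned network, not a random projection.
    flat = x.reshape(-1)
    w = rng.standard_normal((out_dim, flat.size))
    return w @ flat

# Hypothetical input shapes, for illustration only.
nwp = rng.standard_normal((2, 4, 24, 24))        # (channels, steps, h, w)
satellite = rng.standard_normal((11, 7, 24, 24))
recent_generation = rng.standard_normal(12)      # recent generation history
solar_coords = rng.standard_normal(32)           # azimuth + elevation per step
location_embedding = rng.standard_normal(16)     # location ID after embedding

# Late fusion: encode each modality to 1D, then concatenate everything.
features = np.concatenate([
    encode(nwp, 128),
    encode(satellite, 128),
    recent_generation,
    solar_coords,
    location_embedding,
])

# Output network (here just one linear map) produces the future yield forecast.
horizon = 16
w_out = rng.standard_normal((horizon, features.size))
forecast = w_out @ features
print(forecast.shape)  # (16,)
```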
Experiments
Our paper based on this repo was accepted into the Tackling Climate Change with Machine Learning workshop at ICLR 2024 and can be viewed here.
Some more structured notes on experiments we have performed with PVNet are here.
Setup / Installation
git clone git@github.com:openclimatefix/PVNet.git
cd PVNet
pip install .
The commit history is extensive. To save download time, use a depth of 1:
git clone --depth 1 git@github.com:openclimatefix/PVNet.git
This means only the latest commit and its associated files will be downloaded. If you later need the full history, run git fetch --unshallow.
Next, in the PVNet repo, install PVNet as an editable package:
pip install -e .
Additional development dependencies
pip install ".[dev]"
Getting started with running PVNet
Before running any code in PVNet, copy the example configuration to a configs directory:
cp -r configs.example configs
You will be making local amendments to these configs. See the README in
configs.example for more info.
Datasets
As a minimum, in order to create data samples and run PVNet, you will need to supply paths to NWP and GSP data. PV data can also be used. Suggested locations for downloading such datasets are listed below:
GSP (Grid Supply Point) - Regional PV generation data
The University of Sheffield provides API access to download this data:
https://www.solar.sheffield.ac.uk/api/
Documentation for querying generation data aggregated by GSP region can be found here: https://docs.google.com/document/d/e/2PACX-1vSDFb-6dJ2kIFZnsl-pBQvcH4inNQCA4lYL9cwo80bEHQeTK8fONLOgDf6Wm4ze_fxonqK3EVBVoAIz/pub#h.9d97iox3wzmd
NWP (Numerical weather predictions)
OCF maintains a Zarr-formatted version of the German Weather Service's (DWD)
ICON-EU NWP model, whose domain includes the UK, here:
https://huggingface.co/datasets/openclimatefix/dwd-icon-eu
PV
OCF maintains a dataset of PV generation from 1311 private PV installations
here: https://huggingface.co/datasets/openclimatefix/uk_pv
Connecting with ocf-data-sampler for sample creation
Outside the PVNet repo, clone the ocf-data-sampler repo (https://github.com/openclimatefix/ocf-data-sampler), exit the conda env created for PVNet, and create a separate env:
git clone git@github.com:openclimatefix/ocf-data-sampler.git
conda create -n ocf-data-sampler python=3.11
conda activate ocf-data-sampler
Then, inside the ocf-data-sampler repo, install its packages:
pip install .
Then exit this environment, re-enter the pvnet conda environment, and install ocf-data-sampler in editable mode (-e). This links the package directly to the source code in the ocf-data-sampler repo.
pip install -e <PATH-TO-ocf-data-sampler-REPO>
If your local version of ocf-data-sampler is more recent than the version
specified in PVNet, it is not guaranteed to function properly with this library.
Set up and config example for streaming
We will use the following example config file to describe your data sources: /PVNet/configs/datamodule/configuration/example_configuration.yaml. Ensure that the file paths in example_configuration.yaml point to the correct locations: search for PLACEHOLDER to find where to input the locations of the files. Delete or comment out the parts for any data you are not using.
At run time, the datamodule config PVNet/configs/datamodule/streamed_samples.yaml points to your chosen configuration file:
configuration: "/FULL-PATH-TO-REPO/PVNet/configs/datamodule/configuration/example_configuration.yaml"
You can also update train/val/test time ranges here to match the period you have access to.
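For example, the time ranges might look like the fragment below. The key names and dates here are illustrative placeholders; match them to the keys already present in your copy of streamed_samples.yaml:

```yaml
# streamed_samples.yaml (illustrative fragment)
configuration: "/FULL-PATH-TO-REPO/PVNet/configs/datamodule/configuration/example_configuration.yaml"
train_period: ["2019-01-01", "2022-05-07"]
val_period: ["2022-05-08", "2023-05-07"]
test_period: ["2023-05-08", "2023-12-31"]
```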
If downloading private data from a GCP bucket make sure to authenticate gcloud (the public satellite data does not need authentication):
gcloud auth login
You can provide multiple storage locations as a list. For example:
satellite:
  zarr_path:
    - "gs://public-datasets-eumetsat-solar-forecasting/satellite/EUMETSAT/SEVIRI_RSS/v4/2020_nonhrv.zarr"
    - "gs://public-datasets-eumetsat-solar-forecasting/satellite/EUMETSAT/SEVIRI_RSS/v4/2021_nonhrv.zarr"
ocf-data-sampler is currently set up to use 11 channels from the satellite data (the 12th, HRV, is not used).
⚠️ NB: Our publicly accessible satellite data is currently saved with a blosc2 compressor, which is not supported by the tensorstore backend PVNet relies on now. We are in the process of updating this; for now, the paths above cannot be used with this codebase.
Training PVNet
How PVNet is run is determined by the configuration files. The example configs in PVNet/configs.example work with streamed_samples using datamodule/streamed_samples.yaml.
Update the following before training:
- In configs/model/late_fusion.yaml:
  - Update the list of encoders to match the data sources you are using. For different NWP sources, keep the same structure but ensure:
    - in_channels: the number of variables your NWP source supplies
    - image_size_pixels: spatial crop matching your NWP resolution and the settings in your datamodule configuration (unless you coarsened, e.g. for ECMWF)
- In configs/trainer/default.yaml:
  - Set accelerator: 0 if running on a system without a supported GPU
- In configs/datamodule/streamed_samples.yaml:
  - Point configuration: to your local example_configuration.yaml (or your custom one)
  - Adjust the train/val/test time ranges to your available data
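As a rough sketch, an NWP encoder entry in the model config might look like the fragment below. The key names nwp_encoders_dict and ukv are examples only; mirror the structure and target classes already present in the example late_fusion.yaml rather than copying this verbatim:

```yaml
# Illustrative fragment of configs/model/late_fusion.yaml
nwp_encoders_dict:
  ukv:                       # one entry per NWP source you use
    in_channels: 10          # number of variables your NWP source supplies
    image_size_pixels: 24    # spatial crop; match your datamodule configuration
```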
If you create custom config files, update the main ./configs/config.yaml defaults:
defaults:
- trainer: default.yaml
- model: late_fusion.yaml
- datamodule: streamed_samples.yaml
- callbacks: null
- experiment: null
- hparams_search: null
- hydra: default.yaml
Now train PVNet:
python run.py
You can override any setting with Hydra, e.g.:
python run.py datamodule=streamed_samples datamodule.configuration="/FULL-PATH/PVNet/configs/datamodule/configuration/example_configuration.yaml"
Backtest
If you have successfully trained a PVNet model and have a saved checkpoint, you can use it to run a backtest, i.e. produce forecasts on historical data to evaluate forecast accuracy/skill. This can be done by running one of the scripts in this repo, such as the UK GSP backtest script or the PV site backtest script; further information on how to run each is given in the backtest file itself.
Testing
You can use python -m pytest tests to run tests
Contributors ✨
Thanks goes to these wonderful people (emoji key):
This project follows the all-contributors specification. Contributions of any kind welcome!