A Python interface for the PV_Live web API from Sheffield Solar.

PV_Live

A Python interface to the PV_Live web API. See https://www.solar.sheffield.ac.uk/pvlive/

Latest Version: 0.12

New! Updated 2022-07-19 to use the v4 PV_Live API.

About this repository

  • This Python library provides a convenient interface for the PV_Live web API to facilitate accessing PV_Live results in Python code.

  • Developed and tested with Python 3.8; it should work with Python 3.5+. Support for Python 2.7 was discontinued on 2021-01-15.

How do I get set up?

  • Make sure you have Git installed - Download Git

  • Run pip install git+https://github.com/SheffieldSolar/PV_Live-API

Usage

There are three methods for extracting raw data from the PV_Live API:

|Method|Description|Docs Link|
|------|-----------|---------|
|PVLive.latest(entity_type="pes", entity_id=0, extra_fields="", period=30, dataframe=False)|Get the latest PV_Live generation result from the API.|🔗|
|PVLive.at_time(dt, entity_type="pes", entity_id=0, extra_fields="", period=30, dataframe=False)|Get the PV_Live generation result for a given time from the API.|🔗|
|PVLive.between(start, end, entity_type="pes", entity_id=0, extra_fields="", period=30, dataframe=False)|Get the PV_Live generation result for a given time interval from the API.|🔗|
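For orientation, here is a minimal sketch of all three methods, using timezone-aware UTC datetimes as in the examples later in this README (the specific dates used here are illustrative):

```python
from datetime import datetime

import pytz

from pvlive_api import PVLive

pvl = PVLive()

# Latest nationally aggregated GB outturn (a tuple, unless dataframe=True).
latest = pvl.latest()

# Nationally aggregated outturn for a specific settlement period end time.
snapshot = pvl.at_time(datetime(2021, 1, 20, 12, 0, tzinfo=pytz.utc))

# Half-hourly nationally aggregated outturn over one day, as a Pandas DataFrame.
day_of_data = pvl.between(
    start=datetime(2021, 1, 1, 0, 30, tzinfo=pytz.utc),
    end=datetime(2021, 1, 2, 0, 0, tzinfo=pytz.utc),
    dataframe=True,
)
```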

There are two methods for extracting derived statistics:

|Method|Description|Docs Link|
|------|-----------|---------|
|PVLive.day_peak(d, entity_type="pes", entity_id=0, extra_fields="", period=30, dataframe=False)|Get the peak PV_Live generation result for a given day from the API.|🔗|
|PVLive.day_energy(d, entity_type="pes", entity_id=0)|Get the cumulative PV generation for a given day from the API.|🔗|
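A minimal sketch of the derived-statistics methods; passing a datetime.date for d is an assumption here, since the parameter is only documented as "a given day":

```python
from datetime import date

from pvlive_api import PVLive

pvl = PVLive()

# Peak PV_Live generation result for a given day (national by default).
peak = pvl.day_peak(date(2021, 1, 20))

# Cumulative PV generation for the same day.
energy = pvl.day_energy(date(2021, 1, 20))
```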

These methods include the following optional parameters:

|Parameter|Usage|
|---------|-----|
|entity_type|Choose between "pes" or "gsp". If querying for national data, this parameter can be set to either value (or left at its default value), since setting entity_id to 0 will always return national data.|
|entity_id|Set entity_id=0 (the default value) to return nationally aggregated data. If entity_type="pes", specify a pes_id to retrieve data for; if entity_type="gsp", specify a gsp_id. For a full list of GSP and PES IDs, refer to the lookup table hosted on National Grid ESO's data portal.|
|extra_fields|Use this to extract additional fields from the API, such as installedcapacity_mwp. For a full list of available fields, see the PV_Live API Docs.|
|period|Set the desired temporal resolution (in minutes) for PV outturn estimates. Options are 30 (default) or 5.|
|dataframe|Set dataframe=True to return the results as a Pandas DataFrame object, which is generally much easier to work with. The columns of the DataFrame will be pes_id or gsp_id, datetime_gmt, generation_mw, plus any extra fields specified.|
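A sketch of how these parameters combine in a single call (the region and date range here are illustrative):

```python
from datetime import datetime

import pytz

from pvlive_api import PVLive

pvl = PVLive()

# Half-hourly outturn for PES region 23 over one day, including the installed
# capacity field, returned as a Pandas DataFrame with columns pes_id,
# datetime_gmt, generation_mw and installedcapacity_mwp.
df = pvl.between(
    start=datetime(2021, 6, 1, 0, 30, tzinfo=pytz.utc),
    end=datetime(2021, 6, 2, 0, 0, tzinfo=pytz.utc),
    entity_type="pes",
    entity_id=23,
    extra_fields="installedcapacity_mwp",
    dataframe=True,
)
```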

Code Examples

See pvlive_api_demo.py for more example usage.

The examples below assume you have imported the PVLive class and created a local instance called pvl:

```python
from datetime import datetime

import pytz

from pvlive_api import PVLive

pvl = PVLive()
```

|Example|Code|Example Output|
|-------|----|--------------|
|Get the latest nationally aggregated GB PV outturn|pvl.latest()|(0, '2021-01-20T11:00:00Z', 203.0)|
|Get the latest aggregated outturn for PES region 23 (Yorkshire)|pvl.latest(entity_id=23)|(23, '2021-01-20T14:00:00Z', 5.8833031)|
|Get the latest aggregated outturn for GSP ID 120 (INDQ1 or "Indian Queens")|pvl.latest(entity_type="gsp", entity_id=120)|(120, '2021-01-20T14:00:00Z', 1, 3.05604)|
|Get the nationally aggregated GB PV outturn for all of 2020 as a DataFrame|pvl.between(start=datetime(2020, 1, 1, 0, 30, tzinfo=pytz.utc), end=datetime(2021, 1, 1, tzinfo=pytz.utc), dataframe=True)|Screenshot of output|
|Get a list of GSP IDs|pvl.gsp_ids|array([ 0, 1, 2, 3, ..., 336, 337, 338])|
|Get a list of PES IDs|pvl.pes_ids|array([ 0, 1, 2, 3, ..., 336, 337, 338])|

To download data for all GSPs, use something like:

```python
import pandas as pd

def download_pvlive_by_gsp(start, end, include_national=True, extra_fields=""):
    """Download PV_Live results for every GSP as a single DataFrame."""
    data = None
    pvl = PVLive()
    # GSP ID 0 is the national aggregate; skip it unless include_national is True.
    min_gsp_id = 0 if include_national else 1
    for gsp_id in pvl.gsp_ids:
        if gsp_id < min_gsp_id:
            continue
        data_ = pvl.between(start=start, end=end, entity_type="gsp", entity_id=gsp_id,
                            dataframe=True, extra_fields=extra_fields)
        if data is None:
            data = data_
        else:
            data = pd.concat((data, data_), ignore_index=True)
    return data
```
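A usage sketch for the helper above (the dates and output file name are illustrative):

```python
from datetime import datetime

import pytz

# Half-hourly outturn for every GSP, plus the national total, for one day.
gsp_data = download_pvlive_by_gsp(
    start=datetime(2021, 1, 1, 0, 30, tzinfo=pytz.utc),
    end=datetime(2021, 1, 2, 0, 0, tzinfo=pytz.utc),
)
gsp_data.to_csv("pvlive_by_gsp_2021-01-01.csv", index=False)
```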

Command Line Utilities

pv_live

This utility can be used to download data to a CSV file:


```
>> pv_live -h
usage: pvlive.py [-h] [-s "<yyyy-mm-dd HH:MM:SS>"] [-e "<yyyy-mm-dd HH:MM:SS>"] [--entity_type <entity_type>] [--entity_id <entity_id>]
                 [--period <5|30>] [-q] [-o </path/to/output/file>]

This is a command line interface (CLI) for the PV_Live API module

optional arguments:
  -h, --help            show this help message and exit
  -s "<yyyy-mm-dd HH:MM:SS>", --start "<yyyy-mm-dd HH:MM:SS>"
                        Specify a UTC start date in 'yyyy-mm-dd HH:MM:SS' format (inclusive), default behaviour is to retrieve the latest outturn.
  -e "<yyyy-mm-dd HH:MM:SS>", --end "<yyyy-mm-dd HH:MM:SS>"
                        Specify a UTC end date in 'yyyy-mm-dd HH:MM:SS' format (inclusive), default behaviour is to retrieve the latest outturn.
  --entity_type <entity_type>
                        Specify an entity type, either 'gsp' or 'pes'. Default is 'pes'.
  --entity_id <entity_id>
                        Specify an entity ID, default is 0 (i.e. national).
  --period <5|30>       Desired temporal resolution (in minutes) for PV outturn estimates. Default is 30.
  -q, --quiet           Specify to not print anything to stdout.
  -o </path/to/output/file>, --outfile </path/to/output/file>
                        Specify a CSV file to write results to.

Jamie Taylor, 2018-06-04
```
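For example, to download half-hourly outturn for GSP 120 over one day to a CSV file (the dates, GSP ID and file name here are illustrative):

```
>> pv_live -s "2021-01-01 00:30:00" -e "2021-01-02 00:00:00" --entity_type gsp --entity_id 120 -o gsp_120.csv
```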

Using the Docker Image

There is also a Docker Image hosted on Docker Hub which can be used to download data from the PV_Live API with minimal setup:


```
>> docker run -it --rm sheffieldsolar/pv_live-api:<release> pv_live -h
```
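To keep the CSV output after the container exits, one option (an assumption about your setup rather than an official instruction) is to bind-mount a local directory and point the CLI's -o flag at it, replacing <release> with an actual release tag from Docker Hub:

```
>> docker run -it --rm -v $(pwd):/data sheffieldsolar/pv_live-api:<release> pv_live -s "2021-01-01 00:30:00" -e "2021-01-02 00:00:00" -o /data/pvlive.csv
```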

Documentation

How do I upgrade?

Sheffield Solar will endeavour to keep this library in sync with the PV_Live API, so that the latest version of the library always supports the latest version of the API, but cannot guarantee this. To make sure you are forewarned of upcoming changes to the API, email solar@sheffield.ac.uk and ask to be added to the PV_Live user mailing list.

To upgrade the code:

  • Run pip install --upgrade git+https://github.com/SheffieldSolar/PV_Live-API

Who do I talk to?

Authors

License

No license is defined yet - use at your own risk.
