
opennem engine agent


OpenNEM Energy Market Data Access

The OpenNEM project aims to make the wealth of public National Electricity Market (NEM) data more accessible to a wider audience.

This toolkit enables downloading, mirroring and accessing energy data from various networks.

Project homepage at https://opennem.org.au

Available on Docker at https://hub.docker.com/r/opennem/opennem

Currently supports:

Install

You can install this project with python pip:

$ pip install opennem

Alternatively, with Docker:

$ docker pull opennem/opennem

The package is bundled with sqlite support. Other database drivers are optional and are not installed by default. To use another database, install a supported driver:

Postgres:

$ pip install psycopg2

Install Extras

The package contains extra modules that can be installed:

$ poetry install -E postgres

The available extras are:

  • postgres - Postgres database drivers
  • server - API server
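Multiple extras can be combined in a single install, for example:

$ poetry install -E postgres -E server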

Usage

List the crawlers

$ scrapy list

Crawl

$ scrapy crawl au.nem.current.dispatch_scada
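Standard Scrapy crawl options also apply; for example, to additionally export the scraped items to a JSON file (generic Scrapy behaviour, not specific to this project):

$ scrapy crawl au.nem.current.dispatch_scada -o dispatch_scada.json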

Development

Set up a virtual environment and install the requirements using Poetry:

$ poetry install
$ source .venv/bin/activate

Settings are read from environment variables, which can be loaded from a .env file in the root of the project. Set up the environment by copying the .env.sample file to .env. The defaults in the sample file map to the settings in docker-compose.yml.
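For example:

$ cp .env.sample .env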

There is a docker-compose file that will bring up a local database:

$ docker-compose up -d

Apply the database migrations using alembic:

$ alembic upgrade head
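If the database models change, a new migration can be generated with Alembic's standard autogenerate command (assuming the project's Alembic environment is configured for autogeneration; the message text here is illustrative):

$ alembic revision --autogenerate -m "describe the change"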

Run scrapy in the root folder for options:

$ scrapy

The opennem CLI provides other options and settings:

$ opennem -h

Settings for Visual Studio Code are stored in .vscode. Code is kept formatted and linted using pylint, black and isort, with settings defined in pyproject.toml.
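For example, the formatters and linter can be run manually from the project root (assuming the development dependencies are installed; opennem is the package directory):

$ black opennem
$ isort opennem
$ pylint opennem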

Build Release

The script build-release.sh will tag a new release, build the docker image, tag the git version, push to GitHub and push the latest release to PyPI.

Architecture overview

This project uses Scrapy to obtain data from supported energy markets, SQLAlchemy to store it, and Alembic for database migrations. Database storage has been tested with sqlite, postgres and mysql.

For an overview of the Scrapy architecture, see the Scrapy documentation.
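As a rough sketch of how these pieces fit together, the example below shows a generic Scrapy item pipeline persisting scraped items with SQLAlchemy. The model, columns and pipeline name are hypothetical and do not reflect the actual code in opennem/pipelines or opennem/db/models.

from sqlalchemy import Column, Float, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()


class DispatchRecord(Base):
    # Hypothetical table, for illustration only.
    __tablename__ = "dispatch_record"

    id = Column(Integer, primary_key=True)
    facility_code = Column(String)
    generated_mw = Column(Float)


class SQLAlchemyPipeline:
    # A generic Scrapy item pipeline: open a session factory per spider
    # run and persist each scraped item as a row.

    def open_spider(self, spider):
        engine = create_engine("sqlite:///opennem.db")
        Base.metadata.create_all(engine)
        self.Session = sessionmaker(bind=engine)

    def process_item(self, item, spider):
        session = self.Session()
        try:
            session.add(DispatchRecord(**dict(item)))
            session.commit()
        finally:
            session.close()
        return item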

Code Navigation

  • Spider definitions in opennem/spiders
  • Processing pipelines for crawls in opennem/pipelines
  • Database models for supported energy markets are stored in opennem/db/models

Deploy Crawlers

You can deploy the crawlers to the scrapyd server with:

$ scrapyd-deploy

If that command isn't available, install it with:

$ pip install scrapyd-client

This installs the scrapyd-client tools. Project settings are read from scrapy.cfg.
