Data Model for the OCF nowcasting project
nowcasting_datamodel

Datamodel for the nowcasting project. The data model has been made using `sqlalchemy`, with a mirrored model in `pydantic`.
⚠️ Database tables are currently created automatically, but in the future there should be a proper migration process.

Future: The data model could be moved out to form a more modular solution.
nowcasting_datamodel

models.py

All models are in `nowcasting_datamodel.models.py`. The diagram below shows how the different tables are connected.
connection.py

`nowcasting_datamodel.connection.py` contains a connection class which can be used to make a `sqlalchemy` session.
```python
from nowcasting_datamodel.connection import DatabaseConnection

# make connection object
db_connection = DatabaseConnection(url='sqlite:///test.db')

# make sessions
with db_connection.get_session() as session:
    # do something with the database
    pass
```
👓 read.py

`nowcasting_datamodel.read.py` contains functions to read the database. The idea is that these are easy-to-use functions that query the database efficiently.

- `get_latest_forecast`: Get the latest `Forecast` for a specific GSP.
- `get_all_gsp_ids_latest_forecast`: Get the latest `Forecast` for all GSPs.
- `get_forecast_values`: Get the latest `ForecastValue`s for a specific GSP.
- `get_latest_national_forecast`: Return the latest national forecast.
- `get_location`: Get a `Location` object.
```python
from nowcasting_datamodel.connection import DatabaseConnection
from nowcasting_datamodel.read import get_latest_forecast

# make connection object
db_connection = DatabaseConnection(url='sqlite:///test.db')

# make sessions
with db_connection.get_session() as session:
    f = get_latest_forecast(session=session, gsp_id=1)
```
💾 save.py

`nowcasting_datamodel.save.py` has a function to save a list of `Forecast` objects to the database.
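As a rough, self-contained analogue of this save-a-list pattern, the sketch below uses the stdlib `sqlite3` module and a made-up flat `forecast` table; the real `save.py` works on SQLAlchemy `Forecast` objects instead.

```python
import sqlite3

# Hypothetical flat rows standing in for Forecast objects:
# (gsp_id, target_time, power_mw) -- these columns are illustrative only
forecasts = [
    (1, "2022-01-01T00:00", 10.5),
    (2, "2022-01-01T00:00", 5.2),
]

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE forecast (gsp_id INTEGER, target_time TEXT, power_mw REAL)"
)
# Save the whole list in one call
con.executemany("INSERT INTO forecast VALUES (?, ?, ?)", forecasts)
con.commit()

count = con.execute("SELECT COUNT(*) FROM forecast").fetchone()[0]
print(count)
```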
🇬🇧 national.py

`nowcasting_datamodel.national.py` has a useful function for adding up forecasts from all GSPs into a national `Forecast`.
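The aggregation idea can be sketched in plain Python. The real function operates on `Forecast` objects; the per-GSP values below are made up for illustration.

```python
# Made-up per-GSP forecast values (MW) at two target times
gsp_forecasts = {
    1: [10.0, 12.0],
    2: [5.0, 6.0],
    3: [2.5, 3.0],
}

# Sum across GSPs at each target time to get a national series
national = [sum(values) for values in zip(*gsp_forecasts.values())]
print(national)  # [17.5, 21.0]
```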
fake.py

`nowcasting_datamodel.fake.py` contains functions used to make fake model data.
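A minimal sketch of the idea: generate seeded random stand-in values for testing. The helper name and signature here are illustrative, not the library's actual API, which builds full SQLAlchemy objects.

```python
import random


def make_fake_forecast_values(n: int, seed: int = 42) -> list[float]:
    """Hypothetical helper: n reproducible random power values in MW."""
    rng = random.Random(seed)
    return [round(rng.uniform(0.0, 100.0), 1) for _ in range(n)]


values = make_fake_forecast_values(4)
print(values)
```

Seeding the generator keeps tests deterministic: the same seed always yields the same fake data.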
🩺 Testing

Tests are run using the following commands:

```shell
docker stop $(docker ps -a -q)
docker-compose -f test-docker-compose.yml build
docker-compose -f test-docker-compose.yml run tests
```

This sets up `postgres` in a docker container and runs the tests in another docker container. This slightly more complicated testing framework is needed (compared to just running `pytest`) as some queries can not be fully tested on a `sqlite` database.
🛠️ infrastructure

`.github/workflows` contains a number of CI actions:

- linters.yaml: Runs linting checks on the code.
- release.yaml: Makes and pushes docker images on a new code release.
- test-docker.yaml: Runs tests on every push.

The docker file is in the folder `infrastructure/docker/`. The version is bumped automatically for any push to `main`.
Environment Variables

- DB_URL: The database URL to which the forecasts will be saved.
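For example, the connection URL can be read from this variable before building the session; a sketch, where the `sqlite:///test.db` fallback is just a placeholder default:

```python
import os

# Read the database URL from the environment, falling back to a
# local sqlite file (placeholder) if DB_URL is not set
db_url = os.environ.get("DB_URL", "sqlite:///test.db")
print(db_url)
```

The resulting string can then be passed as the `url` argument when constructing `DatabaseConnection`.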