nowcasting_datamodel
Data model for the OCF nowcasting project
The data model is built with sqlalchemy, with a mirrored model in pydantic.
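As a rough illustration of that pattern (a minimal sketch with made-up class names and fields, not the project's actual models), each table is defined once in SQLAlchemy and mirrored by a Pydantic class that can be built from the ORM object:

```python
from pydantic import BaseModel
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class LocationSQL(Base):
    """SQLAlchemy model: defines the database table (illustrative fields)."""

    __tablename__ = "location"

    id = Column(Integer, primary_key=True)
    gsp_name = Column(String)


class Location(BaseModel):
    """Pydantic mirror: used for validation and serialisation."""

    id: int
    gsp_name: str

    class Config:
        orm_mode = True  # enables Location.from_orm(location_sql)
```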
⚠️ Database tables are currently created automatically, but in the future there should be a migration process.
Future: the data model could be moved out to be a more modular solution.
nowcasting_datamodel
models.py
All models are in nowcasting_datamodel.models.py.
The diagram below shows how the different tables are connected.
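For example, the models referenced throughout this README can be imported from there (assuming the class names match those used below):

```python
from nowcasting_datamodel.models import Forecast, ForecastValue, Location
```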
connection.py
nowcasting_datamodel.connection.py contains a connection class which can be used to make a sqlalchemy session.
```python
from nowcasting_datamodel.connection import DatabaseConnection

# make connection object
db_connection = DatabaseConnection(url='sqlite:///test.db')

# make sessions
with db_connection.get_session() as session:
    # do something with the database
    pass
```
👓 read.py
nowcasting_datamodel.read.py contains functions to read from the database.
The idea is that these are easy-to-use functions that query the database efficiently.
- get_latest_forecast: Gets the latest Forecast for a specific GSP.
- get_all_gsp_ids_latest_forecast: Gets the latest Forecast for all GSPs.
- get_forecast_values: Gets the latest ForecastValue for a specific GSP.
- get_latest_national_forecast: Returns the latest national forecast.
- get_location: Gets a Location object.
```python
from nowcasting_datamodel.connection import DatabaseConnection
from nowcasting_datamodel.read import get_latest_forecast

# make connection object
db_connection = DatabaseConnection(url='sqlite:///test.db')

# make sessions
with db_connection.get_session() as session:
    f = get_latest_forecast(session=session, gsp_id=1)
```
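The other read functions follow the same pattern. For example (a sketch; it assumes these functions take the session as a keyword argument, as get_latest_forecast does):

```python
from nowcasting_datamodel.connection import DatabaseConnection
from nowcasting_datamodel.read import get_all_gsp_ids_latest_forecast

db_connection = DatabaseConnection(url='sqlite:///test.db')

with db_connection.get_session() as session:
    # one latest Forecast per GSP
    forecasts = get_all_gsp_ids_latest_forecast(session=session)
```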
💾 save.py
nowcasting_datamodel.save.py has one function to save a list of Forecasts to the database.
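A minimal sketch of saving, assuming the module exposes a function called save that takes the forecasts and a session (the README does not name the function, so treat this as an assumption):

```python
from nowcasting_datamodel.connection import DatabaseConnection
from nowcasting_datamodel.save import save  # function name is an assumption

db_connection = DatabaseConnection(url='sqlite:///test.db')

with db_connection.get_session() as session:
    # `forecasts` is a list of Forecast objects built elsewhere
    save(forecasts=forecasts, session=session)
```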
🇬🇧 national.py
nowcasting_datamodel.national.py has a useful function for adding up forecasts for all GSPs into a national Forecast.
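A sketch of how this might look (make_national_forecast and its signature are assumptions, not confirmed by this README):

```python
from nowcasting_datamodel.national import make_national_forecast  # name is an assumption

# `forecasts` holds one Forecast per GSP; combine them into a single national Forecast
national_forecast = make_national_forecast(forecasts=forecasts)
```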
fake.py
nowcasting_datamodel.fake.py contains functions used to make fake model data.
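For example (a sketch; make_fake_forecast and its parameters are assumptions):

```python
from nowcasting_datamodel.connection import DatabaseConnection
from nowcasting_datamodel.fake import make_fake_forecast  # name is an assumption

db_connection = DatabaseConnection(url='sqlite:///test.db')

with db_connection.get_session() as session:
    # a fake Forecast for GSP 1, useful in tests
    forecast = make_fake_forecast(gsp_id=1, session=session)
```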
🩺 Testing
Tests are run using the following command:

```bash
docker-compose -f test-docker-compose.yml run tests
```

This sets up postgres in one docker container and runs the tests in another.
This slightly more complicated testing setup is needed (compared to simply running pytest) because some queries cannot be fully tested against a sqlite database.
🛠️ infrastructure
.github/workflows contains a number of CI actions:
- linters.yaml: runs linting checks on the code
- release.yaml: builds and pushes docker files on a new code release
- test-docker.yaml: runs tests on every push

The docker file is in the folder infrastructure/docker/.
The version is bumped automatically on any push to main.
Environmental Variables
- DB_URL: The database URL to which the forecasts will be saved
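For example, the connection object can be pointed at the database given by DB_URL (illustrative; how the package itself reads the variable is not shown in this README):

```python
import os

from nowcasting_datamodel.connection import DatabaseConnection

# e.g. DB_URL=postgresql://user:password@localhost:5432/nowcasting
db_connection = DatabaseConnection(url=os.environ["DB_URL"])
```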