
Batch Prediction Pipeline

Check out this Medium article for more details about this module.

Create Environment File

~/energy-forecasting $ cp .env.default .env

The command cp .env.default .env creates a copy of the .env.default file named .env. In many projects, the .env file stores the environment variables the application needs at runtime. The .env.default file is usually a template listing every environment variable the application expects, with default values. Copying it to .env lets you customize those values for your own environment.
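To make the .env mechanism concrete, here is a minimal sketch of how such a file is parsed into key/value pairs. The variable names (FS_API_KEY, ML_PIPELINE_ROOT_DIR's default path) are illustrative placeholders, not this project's actual keys:

```python
def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines from .env-style text into a dict."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        # Skip blank lines and comments.
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


example = """\
# Hypothetical defaults -- the real keys live in .env.default
FS_API_KEY=changeme
ML_PIPELINE_ROOT_DIR=/tmp/energy-forecasting
"""
print(parse_env(example)["FS_API_KEY"])  # changeme
```

In practice, libraries such as python-dotenv do this parsing (plus quoting and interpolation) for you; the sketch only shows the basic idea.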

Set Up the ML_PIPELINE_ROOT_DIR Variable

~/energy-forecasting $ export ML_PIPELINE_ROOT_DIR=$(pwd)

The command export ML_PIPELINE_ROOT_DIR=$(pwd) sets the ML_PIPELINE_ROOT_DIR environment variable to the current directory. Here, $(pwd) is a command substitution that is replaced with the output of the pwd command, which prints the path of the current directory. The export command then makes the variable available to child processes of the current shell.

In essence, ML_PIPELINE_ROOT_DIR is an environment variable that is set to the path of the current directory. This can be useful for scripts or programs that need to reference the root directory of the ML pipeline, as they can simply refer to ML_PIPELINE_ROOT_DIR instead of needing to know the exact path.
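A sketch of how a pipeline module might resolve its root directory from this variable. The get_root_dir helper and the fallback to the current directory are illustrative choices, not necessarily what this project's code does:

```python
import os
from pathlib import Path


def get_root_dir() -> Path:
    """Resolve the pipeline root from ML_PIPELINE_ROOT_DIR, defaulting to cwd."""
    return Path(os.environ.get("ML_PIPELINE_ROOT_DIR", ".")).resolve()


os.environ["ML_PIPELINE_ROOT_DIR"] = "/tmp/energy-forecasting"
print(get_root_dir())
```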

Install for Development

The batch prediction pipeline uses the training pipeline module as a dependency. Thus, as a first step, we must ensure that the training-pipeline module is published to our private PyPI server.

NOTE: Make sure that your private PyPI server is running. Check the Usage section if it isn't.

Build & publish the training-pipeline to your private PyPI server:

cd training-pipeline
poetry build
poetry publish -r my-pypi
cd ..

Install the virtual environment for batch-prediction-pipeline:

~/energy-forecasting                           $ cd batch-prediction-pipeline && rm poetry.lock
~/energy-forecasting/batch-prediction-pipeline $ bash ../scripts/devops/virtual_environment/poetry_install.sh
~/energy-forecasting/batch-prediction-pipeline $ source .venv/bin/activate

Check the Set Up Additional Tools and Usage sections to see how to set up the additional tools and credentials you need to run this project.

Usage for Development

To start the batch prediction script, run:

~/energy-forecasting/batch-prediction-pipeline $ python -m batch_prediction_pipeline.batch
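The internals of batch_prediction_pipeline.batch are not shown here, but a batch-prediction entry point typically loads a trained model, produces forecasts over a horizon for each time series, and writes them to storage. A minimal self-contained sketch, using a naive last-value forecast and hypothetical names throughout:

```python
from typing import Dict, List


def naive_forecast(history: List[float], horizon: int) -> List[float]:
    """Repeat the last observed value across the forecast horizon."""
    return [history[-1]] * horizon


def run_batch(series_by_id: Dict[str, List[float]], horizon: int = 24) -> Dict[str, List[float]]:
    """Produce one forecast per energy-consumption time series."""
    return {sid: naive_forecast(values, horizon) for sid, values in series_by_id.items()}


if __name__ == "__main__":
    data = {"area_1": [10.0, 12.0, 11.5], "area_2": [5.0, 4.8, 5.2]}
    preds = run_batch(data, horizon=3)
    print(preds["area_1"])  # [11.5, 11.5, 11.5]
```

The real module presumably pulls data from a feature store and uses the model published by the training pipeline; this sketch only shows the shape of the loop.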

To compute the monitoring metrics, run the following:

~/energy-forecasting/batch-prediction-pipeline $ python -m batch_prediction_pipeline.monitoring
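The specific metrics computed by batch_prediction_pipeline.monitoring are not detailed here; a common choice for monitoring forecast quality is MAPE (mean absolute percentage error), which could be computed like this (the function and its zero-handling are illustrative):

```python
from typing import List


def mape(y_true: List[float], y_pred: List[float]) -> float:
    """Mean Absolute Percentage Error, in percent; skips zero actuals."""
    pairs = [(t, p) for t, p in zip(y_true, y_pred) if t != 0]
    return 100.0 * sum(abs((t - p) / t) for t, p in pairs) / len(pairs)


print(mape([100.0, 200.0], [110.0, 180.0]))  # 10.0
```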

NOTE: Make sure the ML_PIPELINE_ROOT_DIR variable is set as explained in this section.
