
Batch Prediction Pipeline

Check out this Medium article for more details about this module.

Create Environment File

~/energy-forecasting $ cp .env.default .env

The command cp .env.default .env creates a copy of the .env.default file named .env. In many projects, the .env file stores the environment variables the application needs to run, while .env.default serves as a template that lists every variable the application expects, filled with default values. Copying it to .env lets you customize those values for your own environment.
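As a sketch of what such a template might look like (the variable names below are purely illustrative placeholders, not the project's actual keys — check the repository's .env.default for the real ones):

```shell
# .env.default — template copied to .env and then customized per environment.
# The variable names below are illustrative, not the project's real keys.
FS_API_KEY=your-feature-store-api-key
FS_PROJECT_NAME=your-feature-store-project
GOOGLE_CLOUD_PROJECT=your-gcp-project-id
```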

Set Up the ML_PIPELINE_ROOT_DIR Variable

~/energy-forecasting $ export ML_PIPELINE_ROOT_DIR=$(pwd)

The command export ML_PIPELINE_ROOT_DIR=$(pwd) sets the ML_PIPELINE_ROOT_DIR environment variable to the current directory. Here, $(pwd) is a command substitution that gets replaced with the output of the pwd command, which prints the path of the current directory. The export command then makes this variable available to child processes of the current shell.

In essence, ML_PIPELINE_ROOT_DIR is an environment variable that is set to the path of the current directory. This can be useful for scripts or programs that need to reference the root directory of the ML pipeline, as they can simply refer to ML_PIPELINE_ROOT_DIR instead of needing to know the exact path.
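As a minimal sketch of how a script might consume this variable (the fallback to the current directory and the "output" subdirectory name are assumptions for illustration, not the project's actual configuration):

```python
import os
from pathlib import Path

# Read the pipeline root from the environment; fall back to the current
# working directory when ML_PIPELINE_ROOT_DIR is not set.
ML_PIPELINE_ROOT_DIR = Path(os.environ.get("ML_PIPELINE_ROOT_DIR", ".")).resolve()

# Derive paths relative to the root instead of hardcoding absolute paths
# (the "output" directory name is illustrative).
OUTPUT_DIR = ML_PIPELINE_ROOT_DIR / "output"
```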

Install for Development

The batch prediction pipeline uses the training pipeline module as a dependency. Thus, as a first step, we must ensure that the training pipeline module is published to our private PyPI server.

NOTE: Make sure that your private PyPI server is running. Check the Usage section if it isn't.

Build & publish the training-pipeline to your private PyPI server:

~/energy-forecasting                   $ cd training-pipeline
~/energy-forecasting/training-pipeline $ poetry build
~/energy-forecasting/training-pipeline $ poetry publish -r my-pypi
~/energy-forecasting/training-pipeline $ cd ..
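If Poetry does not yet know about a repository named my-pypi, it must be registered once per machine before publishing. A sketch of the required configuration, where the URL and credential values are placeholders for your own server:

```shell
# Register the private server under the name "my-pypi" (URL is a placeholder).
poetry config repositories.my-pypi http://localhost:8080

# Store publish credentials for that repository (values are placeholders).
poetry config http-basic.my-pypi "$PYPI_USERNAME" "$PYPI_PASSWORD"
```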

Install the virtual environment for batch-prediction-pipeline:

~/energy-forecasting                           $ cd batch-prediction-pipeline && rm poetry.lock
~/energy-forecasting/batch-prediction-pipeline $ bash ../scripts/devops/virtual_environment/poetry_install.sh
~/energy-forecasting/batch-prediction-pipeline $ source .venv/bin/activate

Check the Set Up Additional Tools and Usage sections for the additional tools and credentials you need to run this project.

Usage for Development

To start the batch prediction script, run:

~/energy-forecasting/batch-prediction-pipeline $ python -m batch_prediction_pipeline.batch

To compute the monitoring metrics, run:

~/energy-forecasting/batch-prediction-pipeline $ python -m batch_prediction_pipeline.monitoring

NOTE: Be careful to set the ML_PIPELINE_ROOT_DIR variable as explained in this section.
