
Training Pipeline

Check out this Medium article for more details about this module.

Install for Development

~/energy-forecasting                   $ cd training-pipeline && rm poetry.lock
~/energy-forecasting/training-pipeline $ bash ../scripts/devops/virtual_environment/poetry_install.sh
~/energy-forecasting/training-pipeline $ source .venv/bin/activate

Check the Set Up Additional Tools and Usage sections to see how to set up the additional tools and credentials you need to run this project.

Create Environment File

~/energy-forecasting/training-pipeline $ cp .env.default .env

The command cp .env.default .env creates a copy of the .env.default file named .env. Many projects store the environment variables the application needs in a .env file; .env.default is a template that lists every variable the application expects, filled with default values. Copying it to .env lets you customize those values for your own environment.
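For illustration, here is a minimal sketch of how such a KEY=VALUE file could be parsed in Python. The variable names in the usage example (FS_API_KEY, WANDB_ENTITY) are hypothetical, not taken from this project; real projects often rely on a library such as python-dotenv instead of hand-rolling a parser.

```python
import os


def load_env_file(path):
    """Parse simple KEY=VALUE lines from a .env-style file into a dict.

    Blank lines and lines starting with '#' are skipped. This is a
    minimal sketch, not a full .env parser (no quoting or interpolation).
    """
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values


# Hypothetical usage: load the parsed values into the process environment.
# for key, value in load_env_file(".env").items():
#     os.environ.setdefault(key, value)
```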

Set Up the ML_PIPELINE_ROOT_DIR Variable

~/energy-forecasting/training-pipeline $ export ML_PIPELINE_ROOT_DIR=$(pwd)

The command export ML_PIPELINE_ROOT_DIR=$(pwd) sets the ML_PIPELINE_ROOT_DIR environment variable to the current directory. Here, $(pwd) is a command substitution that is replaced with the output of pwd, which prints the path of the current directory, and export makes the variable visible to child processes of the current shell.

In short, ML_PIPELINE_ROOT_DIR holds the path of the pipeline's root directory, so scripts and programs can refer to ML_PIPELINE_ROOT_DIR instead of hard-coding the exact path.
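As a sketch of how code inside the pipeline could consume this variable (the helper name get_pipeline_root and the fallback to the current working directory are assumptions for illustration, not taken from the project):

```python
import os
from pathlib import Path


def get_pipeline_root() -> Path:
    """Resolve the pipeline root directory.

    Reads ML_PIPELINE_ROOT_DIR from the environment; if it is unset,
    falls back to the current working directory.
    """
    root = os.environ.get("ML_PIPELINE_ROOT_DIR", os.getcwd())
    return Path(root).resolve()
```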

Usage for Development

Run the scripts in the following order:

  1. Start the hyperparameter tuning script:

    ~/energy-forecasting/training-pipeline $ python -m training_pipeline.hyperparameter_tuning
    
  2. Upload the best config based on the previous hyperparameter tuning step:

    ~/energy-forecasting/training-pipeline $ python -m training_pipeline.best_config
    
  3. Start the training script using the best configuration uploaded in the previous step:

    ~/energy-forecasting/training-pipeline $ python -m training_pipeline.train
    

NOTE: Be careful to set the ML_PIPELINE_ROOT_DIR variable as explained in the Set Up the ML_PIPELINE_ROOT_DIR Variable section.
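The three commands above can be wrapped in a small driver script. The following is a hedged sketch: only the module names come from the steps above, while build_commands and run_pipeline are hypothetical helpers.

```python
import os
import subprocess
import sys

# Pipeline steps, in the order described above.
STEPS = [
    "training_pipeline.hyperparameter_tuning",
    "training_pipeline.best_config",
    "training_pipeline.train",
]


def build_commands(python=sys.executable):
    """Return the command line for each step without running anything."""
    return [[python, "-m", module] for module in STEPS]


def run_pipeline(root_dir):
    """Run all steps sequentially with ML_PIPELINE_ROOT_DIR set.

    Stops at the first failing step because check=True raises
    CalledProcessError on a non-zero exit code.
    """
    env = {**os.environ, "ML_PIPELINE_ROOT_DIR": root_dir}
    for cmd in build_commands():
        subprocess.run(cmd, check=True, env=env)
```

Running the steps sequentially matters here: best_config depends on the tuning results, and train depends on the uploaded best configuration.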
