Distributed Optimizer

Project description

How to use

We'll be running distributed_optimizer.py to start the optimization.

distributed_optimizer.py can be run in two modes: host mode and client mode. Host mode spawns two processes: one that starts the Optimizer of choice, and one that connects to the other machines (the clients) and runs distributed_optimizer.py in client mode on them. Client mode runs the user-specified MyWhateverTrainer.

  • Host mode (the default): python3 distributed_optimizer.py
  • Client mode: python3 distributed_optimizer.py --run_as=client

You do not need to run client mode manually on each client machine; host mode does this for you. You can still run it manually if you want to add client machines during an optimization.

Essential parts of the script (a minimal sketch follows this list):

  • COMMANDS: a constant dictionary whose keys are machine categories and whose values are the commands needed to run distributed_optimizer.py in client mode on machines of each category.

  • A MyWhateverTrainer that inherits from the Trainer abstract class in src.trainer and implements the abstract method get_observation, in which the given set of hyperparameters (the candidate) is plugged into the objective function.

  • A start_host() function that spawns two processes: one that starts the Optimizer and one that runs the appropriate sequence of commands to launch distributed_optimizer.py in client mode, i.e. starts Trainers on the respective machines with: python3 distributed_optimizer.py --run_as=client

  • A start_client() function that runs MyWhateverTrainer, i.e. runs the objective function.

  • A main() function that parses the command-line input, switches between host and client mode, and specifies any further information needed to run the objective function on the target machines.
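
Putting those parts together, here is a minimal sketch of how distributed_optimizer.py could be laid out. Only the names taken from the description above (COMMANDS, Trainer from src.trainer, get_observation, MyWhateverTrainer, start_host, start_client, main, --run_as) come from this page; the constructor arguments, the run_optimizer and my_objective_function helpers, the trainer's run() entry point, and the return format of get_observation are illustrative assumptions and may differ from the actual dopt API.

    import argparse
    import subprocess
    from multiprocessing import Process

    from src.trainer import Trainer  # abstract base class providing get_observation

    # Keys are machine categories; values are the commands that start client mode there.
    COMMANDS = {
        "example-machine": "ssh user@example-machine 'python3 distributed_optimizer.py --run_as=client'",
    }


    def my_objective_function(**hyperparameters):
        # Hypothetical objective function: replace with your own training/evaluation.
        return sum(hyperparameters.values())


    def run_optimizer():
        # Hypothetical helper: start the Optimizer of your choice here.
        pass


    class MyWhateverTrainer(Trainer):
        def get_observation(self, candidate):
            # `candidate` is the set of hyperparameters proposed by the Optimizer.
            # Plug it into the objective function; the exact return format
            # expected by dopt may differ from this placeholder.
            return my_objective_function(**candidate)


    def launch_clients():
        # Run the appropriate command for every machine listed in COMMANDS,
        # i.e. start client mode on the client machines.
        procs = [subprocess.Popen(command, shell=True) for command in COMMANDS.values()]
        for proc in procs:
            proc.wait()


    def start_host():
        # Two processes: one starts the Optimizer, the other launches the clients.
        optimizer_process = Process(target=run_optimizer)
        client_process = Process(target=launch_clients)
        optimizer_process.start()
        client_process.start()
        optimizer_process.join()
        client_process.join()


    def start_client():
        # Run the user-specified trainer, i.e. evaluate the objective function.
        # How the trainer exchanges candidates and observations with the host
        # is handled by dopt's Trainer class; run() is a placeholder name here.
        trainer = MyWhateverTrainer()
        trainer.run()


    def main():
        parser = argparse.ArgumentParser()
        parser.add_argument("--run_as", default="host", choices=["host", "client"])
        args = parser.parse_args()
        if args.run_as == "host":
            start_host()
        else:
            start_client()


    if __name__ == "__main__":
        main()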

Checklist:

  • Step 1: Create your MyWhateverTrainer that implements the get_observation method.
  • Step 2: Make sure the appropriate environment can be chosen through SSH tunneling. Try: ssh [name]@[hostmachine] [YOUR COMMANDS]
  • Step 3: Add those commands to the COMMANDS dictionary (see the example after this list).
  • Step 4: Make sure you have a copy of distributed_optimizer.py and its related files on all of the machines you intend to use.
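
As an example for Steps 2 and 3, a COMMANDS entry is typically the same command you verified by hand over SSH. In the sketch below, the category name "lab-gpu", the user "alice", the host "lab-gpu-01", the virtual environment, and the paths are all hypothetical placeholders:

    # Hypothetical entry: the same command verified manually with
    #   ssh alice@lab-gpu-01 "source ~/dopt-env/bin/activate && cd ~/dopt && python3 distributed_optimizer.py --run_as=client"
    COMMANDS = {
        "lab-gpu": (
            'ssh alice@lab-gpu-01 '
            '"source ~/dopt-env/bin/activate && cd ~/dopt && '
            'python3 distributed_optimizer.py --run_as=client"'
        ),
    }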


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dopt-0.0.6.tar.gz (17.6 kB)

Uploaded Source

Built Distribution

dopt-0.0.6-py3-none-any.whl (21.8 kB)

Uploaded Python 3

File details

Details for the file dopt-0.0.6.tar.gz.

File metadata

  • Download URL: dopt-0.0.6.tar.gz
  • Upload date:
  • Size: 17.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.13

File hashes

Hashes for dopt-0.0.6.tar.gz

  • SHA256: 0908d4de942a021501d6845490ed65719e677317f728fa83e74e002aec884c6c
  • MD5: 191775a6129c762e9cecf645ad4b69ba
  • BLAKE2b-256: b558752f6d44bc5d82b6d99253dfb39ff17166b2506a359feec3b6019d437a1b

See more details on using hashes here.

File details

Details for the file dopt-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: dopt-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 21.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.13

File hashes

Hashes for dopt-0.0.6-py3-none-any.whl

  • SHA256: 9e7ba6a30e8a94c8a50515e2af7973608184efe5d482ae944a086823cb258061
  • MD5: 31757c1b45199fb60f9c6d85db111425
  • BLAKE2b-256: f49882a4c46615812d9fbad98e4e5084f44f871757f21428fcb3dbe809f8ed32

See more details on using hashes here.
