Distributed Optimizer
Project description
How to use
We'll be running `distributed_optimizer.py` to start the optimization.

`distributed_optimizer.py` can be run in two modes: host mode and client mode. Host mode spawns two processes: one that starts the Optimizer of choice, and one that connects to the other machines (the clients) to run `distributed_optimizer.py` in client mode. Client mode runs the user-specified `MyWhateverTrainer`.

- Host mode: `python3 distributed_optimizer.py` (host mode is the default).
- Client mode: `python3 distributed_optimizer.py --run_as=client`. You do not need to run client mode manually on each client machine, as host mode does this for you. However, you can use it to add client machines during optimization.
Essential parts of the script:

- The `COMMANDS`: a constant dictionary whose keys are machine categories and whose values are the commands needed to run `distributed_optimizer.py` in client mode on those categories (see the sketch after this list).
- A `MyWhateverTrainer` that inherits the `Trainer` abstract class from `src.trainer` and implements the abstract method `get_observation`, in which the given set of hyperparameters (the candidate) is plugged into the objective function.
- A `start_host()` function that spawns two processes: one that starts the `Optimizer`, and one that runs the appropriate sequence of commands to launch `distributed_optimizer.py` in client mode, i.e. start `Trainer`s on the respective machines via `python3 distributed_optimizer.py --run_as=client`.
- A `start_client()` function that runs `MyWhateverTrainer`, i.e. runs the objective function.
- A `main()` that parses the command-line input, switches between host and client mode, and specifies any further information needed to run the objective function on the target machines (see the sketch after the check list below).
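As an illustration, here is a minimal sketch of a `COMMANDS` dictionary and a `Trainer` subclass. The machine categories, commands, and hyperparameter names are placeholders, and the exact `get_observation` signature and return format are assumptions based on the description above.

```python
# Minimal sketch, not the project's reference implementation.
from src.trainer import Trainer  # assumed import path, as described above

# Hypothetical machine categories mapped to the commands that start client mode
# on them; adjust paths, environments, and category names to your setup.
COMMANDS = {
    "gpu_lab": "cd ~/dopt && python3 distributed_optimizer.py --run_as=client",
    "cpu_cluster": "source venv/bin/activate && python3 distributed_optimizer.py --run_as=client",
}

class MyWhateverTrainer(Trainer):
    def get_observation(self, candidate):
        # `candidate` is assumed to be a dict of hyperparameters proposed by
        # the Optimizer; plug it into your objective function and return the result.
        lr = candidate["learning_rate"]   # hypothetical hyperparameter
        depth = candidate["num_layers"]   # hypothetical hyperparameter
        objective = -((lr - 0.01) ** 2) - 0.1 * depth  # toy objective; replace with real training
        return objective
```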
Check list:

- Step 1: Create your specified `MyWhateverTrainer` that implements the `get_observation` method.
- Step 2: Make sure the appropriate environment can be chosen through ssh tunneling. Try: `ssh [name]@[hostmachine] [YOUR COMMANDS]`
- Step 3: Add the commands to the `COMMANDS` dictionary.
- Step 4: Make sure you have a copy of `distributed_optimizer.py` and the related files on all of the machines you intend to use.
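To tie the pieces together, here is a minimal sketch of the mode-switching `main()` described above. The `--run_as` flag comes from the usage shown earlier, but the argument-parsing details and the `start_host()`/`start_client()` bodies below are placeholder assumptions.

```python
import argparse

def start_host():
    # Placeholder: in the real script this starts the Optimizer process and
    # runs the COMMANDS over ssh to launch client mode on each machine.
    ...

def start_client():
    # Placeholder: in the real script this runs MyWhateverTrainer, i.e. the
    # objective function, on the current machine.
    ...

def main():
    parser = argparse.ArgumentParser(description="Distributed optimizer entry point")
    parser.add_argument("--run_as", choices=["host", "client"], default="host")
    args = parser.parse_args()

    if args.run_as == "host":
        start_host()
    else:
        start_client()

if __name__ == "__main__":
    main()
```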
Download files
Download the file for your platform.
Source Distribution: dopt-0.0.6.tar.gz

Built Distribution: dopt-0.0.6-py3-none-any.whl
File details
Details for the file dopt-0.0.6.tar.gz.
File metadata
- Download URL: dopt-0.0.6.tar.gz
- Upload date:
- Size: 17.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.13
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0908d4de942a021501d6845490ed65719e677317f728fa83e74e002aec884c6c |
| MD5 | 191775a6129c762e9cecf645ad4b69ba |
| BLAKE2b-256 | b558752f6d44bc5d82b6d99253dfb39ff17166b2506a359feec3b6019d437a1b |
File details
Details for the file dopt-0.0.6-py3-none-any.whl.
File metadata
- Download URL: dopt-0.0.6-py3-none-any.whl
- Upload date:
- Size: 21.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.13
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9e7ba6a30e8a94c8a50515e2af7973608184efe5d482ae944a086823cb258061 |
| MD5 | 31757c1b45199fb60f9c6d85db111425 |
| BLAKE2b-256 | f49882a4c46615812d9fbad98e4e5084f44f871757f21428fcb3dbe809f8ed32 |