An intuitive and modular simulator for assessing the marginal value of a client's contribution in a decentralized setting.
Project description
Settings Configuration
In order to run the simulation, the Orchestrator instance must receive a settings object that contains all the necessary parameters. It is possible to store those parameters in JSON format and load them as a Python dictionary using the asociita.utils.helper.load_from_json function. Below is an exemplary settings object embedded as a JSON file. All the elements are required unless stated otherwise.
{
    "orchestrator": {
        "iterations": int,
        "number_of_nodes": int,
        "local_warm_start": bool,
        "sample_size": int,
        "evaluation": "none" | "full",
        "save_metrics": bool,
        "save_models": bool,
        "save_path": str,
        "nodes": [0, 1, 2]
    },
    "nodes": {
        "local_epochs": int,
        "model_settings": {
            "optimizer": "RMS",
            "batch_size": int,
            "learning_rate": float
        }
    }
}
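As a minimal sketch, a settings file like the one above can be loaded and handed to the orchestrator roughly as follows; load_from_json is the helper named above, while the Orchestrator import path and constructor signature are assumptions made purely for illustration:

    from asociita.utils.helper import load_from_json
    # Hypothetical import path; adjust to the actual package layout.
    from asociita.components.orchestrator import Orchestrator

    # Load the JSON settings file into a plain Python dictionary.
    settings = load_from_json("simulation_settings.json")

    # Assumed constructor: the orchestrator receives the settings dictionary.
    orchestrator = Orchestrator(settings=settings)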
The settings object contains two dictionaries: orchestrator and nodes.
The orchestrator dictionary contains all the details necessary for the training:
- iterations is the number of rounds to be performed. Example: iterations: 12
- number_of_nodes is the number of nodes that will be included in the training. Example: number_of_nodes: 10
- local_warm_start allows distributing various pre-trained weights to different local clients. Not implemented yet. Example: local_warm_start: false
- sample_size is the size of the sample that will be taken each round. Example: sample_size: 4
- evaluation controls the evaluation procedure across the clients. Currently, only none and full are supported. Setting the evaluation to full will perform a full evaluation of every client included in the training. Example: evaluation: full
- save_metrics controls whether the metrics should be saved in a csv file. Example: save_metrics: true
- save_models controls whether the models should be saved. Not implemented yet. Example: save_models: false
- save_path is the system path that will be used when saving the model. It is also possible to define a saving_path in a method call.
- nodes is the list containing the ids of all the nodes participating in the training. The length of nodes must be equal to number_of_nodes.
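For illustration, an orchestrator block filled in with the example values listed above could be assembled as a Python dictionary like this; the save_path value is a hypothetical placeholder, since no example is given for it:

    # Orchestrator settings built from the example values above.
    # "results" as save_path is only a placeholder.
    orchestrator_settings = {
        "iterations": 12,
        "number_of_nodes": 10,
        "local_warm_start": False,
        "sample_size": 4,
        "evaluation": "full",
        "save_metrics": True,
        "save_models": False,
        "save_path": "results",
        "nodes": list(range(10)),  # length equals number_of_nodes
    }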
The nodes dictionary contains all the necessary configuration for the nodes:
- local_epochs is the number of local epochs to be performed on the local nodes.
- model_settings is a dictionary containing all the parameters for training the model:
  - optimizer is the optimizer that will be used during the training. Example: optimizer: "RMS"
  - batch_size is the batch size that will be used during the training. Example: batch_size: 32
  - learning_rate is the learning rate that will be used during the training. Example: learning_rate: 0.001
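Following the same pattern, a nodes block using the example values above might look as follows; the local_epochs value of 3 is an arbitrary illustration, since the documentation does not give one:

    # Node settings built from the example values above.
    nodes_settings = {
        "local_epochs": 3,  # arbitrary illustrative value
        "model_settings": {
            "optimizer": "RMS",
            "batch_size": 32,
            "learning_rate": 0.001,
        },
    }

    # Combined with the orchestrator block sketched earlier, this forms the
    # complete settings dictionary expected by the Orchestrator.
    settings = {"orchestrator": orchestrator_settings, "nodes": nodes_settings}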
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
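If you simply want to use the package rather than download a specific file, it can most likely be installed from PyPI with pip, assuming the distribution name matches the files listed below:

    pip install asociita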
Source Distribution
Built Distribution
File details
Details for the file asociita-0.2.1.tar.gz.
File metadata
- Download URL: asociita-0.2.1.tar.gz
- Upload date:
- Size: 34.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.0 CPython/3.10.11 Linux/5.19.0-38-generic
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0b16b7c5c8be829de6ce1488427705e377420c7c73d058972533519f0cc23233
MD5 | 7f6e80a53e2f2cb25d71962736b5cdb4
BLAKE2b-256 | 55813743d90ab173df09e28dfab921b053873b47f23f1425e04d222cac36b883
File details
Details for the file asociita-0.2.1-py3-none-any.whl.
File metadata
- Download URL: asociita-0.2.1-py3-none-any.whl
- Upload date:
- Size: 50.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.0 CPython/3.10.11 Linux/5.19.0-38-generic
File hashes
Algorithm | Hash digest
---|---
SHA256 | ac04d2494559f49804157a3d8a09a762200c5f29f8dad44dcce1489cb3b0b4bc
MD5 | 95a9fe43cc61bb6e0ce911452bd03888
BLAKE2b-256 | 4a4d802fc807cadd667715fb37d537b0972acac2d4a2d61a1ba6447059ec0dd0