FedPredict is a personalization plugin for Federated Learning methods.

Project description

Welcome to FedPredict

The first-ever Federated Learning plugin!

FedPredict is a Federated Learning (FL) plugin that can significantly improve FL solutions without requiring additional training or expensive processing. FedPredict enables personalization for traditional methods, such as FedAvg and FedYogi. It is also a modular plugin that operates in the prediction stage of FL without requiring any modification in the training step. This project has been developed in the laboratories WISEMAP (UFMG), H.IAAC (UNICAMP), and NESPED (UFV).

The following projects use FedPredict (list being updated):

  • FL-H.IAAC: contains the code of the experiments from the FedPredict papers published at IEEE DCOSS-IoT 2023 and 2024 (i.e., FedPredict and FedPredict-Dynamic).
  • PFLib (will be available soon).
  • PyFlexe (will be available soon).

Documentation

Please access the FedPredict documentation for tutorials and API details.

Why FedPredict?

Operating in the prediction stage yields better results than modifying the training stage, while keeping the original method untouched.

How does it work?

FedPredict intelligently combines global and local model parameters. In this process, it assigns more or less weight to each type of parameter according to various factors, such as the evolution level (el) of the global model, the update level (ul) of the local model, and the similarity (s) between the old data (i.e., the data on which the model was previously trained) and the recently acquired data. The client then uses the combined model to make predictions over the test/validation data.
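The combination step can be sketched as a per-layer weighted average. The weighting function below, built from el, ul, and s, is only an illustration of the idea, not FedPredict's published formula, and the helper name is ours:

```python
import numpy as np

def combine_parameters(global_params, local_params, el, ul, s):
    """Blend global and local layer parameters per layer.

    el: evolution level of the global model, in [0, 1]
    ul: update level of the local model, in [0, 1]
    s:  similarity between old and new local data, in [0, 1]

    Hypothetical weighting, not FedPredict's exact rule: a more
    evolved global model and a staler / more dissimilar local
    model shift weight toward the global parameters.
    """
    w_global = el * (1.0 - ul * s)  # illustrative weighting only
    return [w_global * g + (1.0 - w_global) * l
            for g, l in zip(global_params, local_params)]

# Example: a two-layer model; the local model is fully updated
# (ul = 1) on data similar to its training data (s = 1), so the
# combined model keeps the local parameters unchanged.
g = [np.ones((2, 2)), np.ones(2)]
l = [np.zeros((2, 2)), np.zeros(2)]
combined = combine_parameters(g, l, el=0.8, ul=1.0, s=1.0)
```

Because the blend happens only at prediction time, the training loop of the underlying FL method does not change.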

Benefits

The benefits of the plugin are as follows:

  1. High performance: achieves high performance on heterogeneous data.
  2. High efficiency for FL: achieves high performance even with less training.
  3. Concept drift awareness: FedPredict makes the model adapt almost instantly to the new scenario when concept drift occurs.
  4. Task independence: apply FedPredict to any type of deep neural network task.
  5. Easy to use and modular: no modifications are necessary in the training stage of your solution!
  6. Lightweight: it consists of simple operations.
  7. Low downlink communication cost: the FedPredict server compresses global model parameters.

Just plug and play!

Installation

FedPredict is compatible with Python >= 3.8 and is tested on the latest versions of Ubuntu. With your virtual environment activated, if you are using Torch, install FedPredict from PyPI with:

    pip install fedpredict[torch]

If you are using Flower for FL simulation, type:

    pip install fedpredict[flwr]

FL requirements

In general, if your solution shares some level of similarity with FedAvg, then FedPredict is ready to use. The requirements are described as follows:

  • Sharing all layers: clients must upload all model layers at every round so the server can aggregate a global model that a new client can leverage directly, as in FedAvg.
  • Same model structure: the layers of the global and local models must have the same shapes to allow the combination of parameters.
  • Predicting with the combined model: on the client side, the original method must be flexible enough to make predictions with the combined model; otherwise, the plugin has no effect.
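The "same model structure" requirement can be checked before any combination is attempted. A minimal sketch, where the helper name is ours and not part of the FedPredict API:

```python
import numpy as np

def shapes_match(global_params, local_params):
    """Return True if the two models have the same number of layers
    and every corresponding layer has the same shape, which is a
    precondition for combining their parameters."""
    if len(global_params) != len(local_params):
        return False
    return all(g.shape == l.shape
               for g, l in zip(global_params, local_params))

# Matching structures: a 10->5 linear layer with its bias
global_model = [np.zeros((10, 5)), np.zeros(5)]
local_model = [np.zeros((10, 5)), np.zeros(5)]

# Mismatched structure: different output width
mismatched = [np.zeros((10, 4)), np.zeros(4)]
```

A solution that personalizes only a subset of layers (e.g., keeps a private head) would fail this check and therefore fall outside the plugin's requirements.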

Components

Our solution has two main components: FedPredict client and FedPredict server. Their objectives are described below:

  • FedPredict client: transfers knowledge from the updated global model to the client's stale local model.
  • FedPredict server: compresses the updated global model parameters before sending them to the clients; used together with the FedPredict client.
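Server-side parameter compression can take many forms; the top-k sparsification below is only an illustrative stand-in, not FedPredict's actual compression scheme, and both function names are ours:

```python
import numpy as np

def topk_compress(layer, k_ratio=0.1):
    """Keep only the k largest-magnitude entries of a layer
    (illustrative sparsification, not FedPredict's scheme).
    Returns the kept values, their flat indices, and the layer
    shape, enough for the client to rebuild a sparse copy."""
    flat = layer.ravel()
    k = max(1, int(k_ratio * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return flat[idx], idx, layer.shape

def topk_decompress(values, idx, shape):
    """Rebuild the sparse layer on the client side, zero-filling
    the entries that were dropped on the server."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

# Keep the top 10% of a 10x10 layer: 10 values + 10 indices
# travel downlink instead of 100 values.
layer = np.arange(100, dtype=float).reshape(10, 10)
vals, idx, shape = topk_compress(layer, k_ratio=0.1)
restored = topk_decompress(vals, idx, shape)
```

Whatever the concrete scheme, the point of the server component is the same: shrink the downlink payload while keeping enough of the global model for the client-side combination to work.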

Citing

If FedPredict has been useful to you, please cite our paper. The BibTeX is presented as follows:

@inproceedings{capanema2023fedpredict,
  title={FedPredict: Combining Global and Local Parameters in the Prediction Step of Federated Learning},
  author={Capanema, Cl{\'a}udio GS and de Souza, Allan M and Silva, Fabr{\'\i}cio A and Villas, Leandro A and Loureiro, Antonio AF},
  booktitle={2023 19th International Conference on Distributed Computing in Smart Systems and the Internet of Things (DCOSS-IoT)},
  pages={17--24},
  year={2023},
  doi={10.1109/DCOSS-IoT58021.2023.00012},
  organization={IEEE}
}

