FedPredict is a personalization plugin for Federated Learning methods.

Project description

Welcome to FedPredict

The first-ever Federated Learning plugin!

FedPredict is a Federated Learning (FL) plugin that can significantly improve FL solutions without requiring additional training or expensive processing. FedPredict enables personalization for traditional methods, such as FedAvg and FedYogi. It is also a modular plugin that operates in the prediction stage of FL without requiring any modification to the training step. This project has been developed at the WISEMAP (UFMG), H.IAAC (UNICAMP), and NESPED (UFV) laboratories.

The list of projects that use FedPredict (still being updated):

  • FL-H.IAAC: contains the code for the experiments of the FedPredict papers published at IEEE DCOSS-IoT 2023 and 2024 (i.e., FedPredict and FedPredict-Dynamic).
  • PFLib (will be available soon).
  • PyFlexe (will be available soon).

Documentation

Please access the FedPredict documentation for tutorials and API details.

Why FedPredict?

Because it works only in the prediction stage, FedPredict improves accuracy without adding training or communication overhead.

How does it work?

FedPredict intelligently combines global and local model parameters. In this process, it assigns more or less weight to each type of parameter according to various factors, such as the evolution level (el) of the global model, the update level (ul) of the local model, and the similarity (s) between the old data (i.e., the data on which the model was previously trained) and the recently acquired data. Then, the client uses the combined model to make predictions over the test/validation data.
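The combination step can be pictured as a per-layer weighted average of global and local parameters. The sketch below is illustrative only: the `combination_weight` function is a simplified stand-in for the actual FedPredict rule (which depends on el, ul, and s as described above), not the published formula.

```python
import numpy as np

def combination_weight(el, ul, s):
    """Illustrative weighting (NOT the actual FedPredict rule):
    favor the global model when it has evolved a lot (high el),
    the local model is stale (low ul), or the client's data has
    shifted away from the training data (low similarity s)."""
    w = el * (1.0 - ul) * (1.0 - s)
    return min(max(w, 0.0), 1.0)  # clamp to [0, 1]

def combine_models(global_params, local_params, el, ul, s):
    """Per-layer convex combination of global and local parameters."""
    w = combination_weight(el, ul, s)
    return {name: w * global_params[name] + (1.0 - w) * local_params[name]
            for name in global_params}

# Example with two toy layers
g = {"fc.weight": np.ones((2, 2)), "fc.bias": np.zeros(2)}
l = {"fc.weight": np.zeros((2, 2)), "fc.bias": np.ones(2)}
combined = combine_models(g, l, el=1.0, ul=0.0, s=0.5)  # w = 0.5
```

The client would then load `combined` into its model before running inference; training is untouched.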

Benefits

The benefits of the plugin are as follows:

  1. High performance: achieves high accuracy on heterogeneous data.
  2. High efficiency for FL: achieves high performance even with less training.
  3. Concept drift awareness: FedPredict lets the model adapt almost instantly when concept drift occurs.
  4. Task-independent: apply FedPredict to any type of deep neural network task.
  5. Easy to use and modular: no modifications are necessary in the training stage of your solution!
  6. Lightweight: it is composed of simple operations.
  7. Low downlink communication cost: the FedPredict server compresses the global model parameters.

Just plug and play!

Installation

FedPredict is compatible with Python>=3.8 and is tested on the latest versions of Ubuntu. With your virtual environment activated, if you are using PyTorch, install FedPredict from PyPI with:

    pip install fedpredict[torch]

If you are using Flower for FL simulation, type:

    pip install fedpredict[flwr]

FL requirements

In general, if your solution shares some level of similarity with FedAvg, then FedPredict is ready to use. The requirements are described as follows:

  • Sharing all layers: the clients have to upload all model layers at every round so the server can aggregate a global model that a new client can directly leverage, as in FedAvg.
  • Same model structure: the layers of the global and local models must have the same shape to allow the combination of parameters.
  • Predicting using the combined model: on the client side, the original method has to be flexible enough to make predictions with the combined model; otherwise, the plugin has no effect.
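The "same model structure" requirement can be checked before combining. The helper below is a hypothetical sketch (not part of the FedPredict API) that verifies the two parameter dictionaries are layer-by-layer compatible:

```python
import numpy as np

def shapes_compatible(global_params, local_params):
    """Return True iff both models have the same layer names and
    every corresponding layer has the same shape, so their
    parameters can be combined element-wise."""
    if global_params.keys() != local_params.keys():
        return False
    return all(global_params[k].shape == local_params[k].shape
               for k in global_params)

g = {"conv.weight": np.zeros((8, 3, 3, 3)), "conv.bias": np.zeros(8)}
l_ok = {"conv.weight": np.zeros((8, 3, 3, 3)), "conv.bias": np.zeros(8)}
l_bad = {"conv.weight": np.zeros((4, 3, 3, 3)), "conv.bias": np.zeros(4)}
```

A check like this would fail, for example, when a client prunes or widens layers locally, which is exactly the situation the requirement rules out.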

Components

Our solution has two main components: FedPredict client and FedPredict server. Their objectives are described below:

  • FedPredict client: transfers the knowledge from the updated global model to the client's stale local model.
  • FedPredict server: compresses the updated global model parameters before sending them to the clients. Used together with the FedPredict client.
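As an illustration of the server-side role, the sketch below compresses each layer by keeping only its k largest-magnitude entries (plain top-k sparsification). This is a generic stand-in chosen for clarity; FedPredict's actual compression scheme may differ.

```python
import numpy as np

def sparsify_top_k(layer, k):
    """Keep the k largest-magnitude entries of a layer and zero the
    rest -- a simple stand-in for server-side compression of global
    model parameters before the downlink."""
    flat = layer.ravel()
    if k >= flat.size:
        return layer.copy()
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of top-k entries
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(layer.shape)

layer = np.array([[0.1, -2.0], [3.0, 0.05]])
compressed = sparsify_top_k(layer, k=2)  # keeps only -2.0 and 3.0
```

Only the non-zero entries (values plus indices) would need to be transmitted, which is where the downlink savings come from.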

Citing

If FedPredict has been useful to you, please cite our paper. The BibTeX is presented as follows:

@inproceedings{capanema2023fedpredict,
  title={FedPredict: Combining Global and Local Parameters in the Prediction Step of Federated Learning},
  author={Capanema, Cl{\'a}udio GS and de Souza, Allan M and Silva, Fabr{\'\i}cio A and Villas, Leandro A and Loureiro, Antonio AF},
  booktitle={2023 19th International Conference on Distributed Computing in Smart Systems and the Internet of Things (DCOSS-IoT)},
  pages={17--24},
  year={2023},
  doi={10.1109/DCOSS-IoT58021.2023.00012},
  organization={IEEE}
}
