Flower - A Friendly Federated Learning Framework
Flower (flwr) is a framework for building federated learning systems. The design of Flower is based on a few guiding principles:
- Customizable: Federated learning systems vary wildly from one use case to another. Flower allows for a wide range of different configurations depending on the needs of each individual use case.
- Extendable: Flower originated from a research project at the University of Oxford, so it was built with AI research in mind. Many components can be extended and overridden to build new state-of-the-art systems.
- Framework-agnostic: Different machine learning frameworks have different strengths. Flower can be used with any machine learning framework, for example, PyTorch, TensorFlow, MXNet, scikit-learn, or even raw NumPy for users who enjoy computing gradients by hand.
- Understandable: Flower is written with maintainability in mind. The community is encouraged to both read and contribute to the codebase.
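As a rough illustration of the framework-agnostic design, the sketch below shows a Flower client that exchanges plain NumPy arrays with the server; the "model" here is just a placeholder weight vector, and the server address and client class are illustrative assumptions rather than part of this page.

```python
# Minimal sketch (not an official example): a Flower NumPyClient that trades
# plain NumPy arrays with the server. Any ML framework could produce them.
import flwr as fl
import numpy as np


class ToyClient(fl.client.NumPyClient):
    def __init__(self):
        self.weights = [np.zeros(10)]  # stand-in for real model parameters

    def get_parameters(self):
        return self.weights

    def fit(self, parameters, config):
        self.weights = parameters  # a real client would train locally here
        return self.weights, 1, {}

    def evaluate(self, parameters, config):
        return 0.0, 1, {}  # loss, number of examples, metrics


# Connect to a Flower server assumed to be running on localhost:8080.
fl.client.start_numpy_client("localhost:8080", client=ToyClient())
```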
Meet the Flower community on flower.dev!
Documentation
- Installation
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (MXNet)
- Quickstart (scikit-learn [code example])
Flower Usage Examples
A number of examples show different usage scenarios of Flower (in combination with popular machine learning frameworks such as PyTorch or TensorFlow). To run an example, first install the extras it requires. A minimal server-side sketch follows the lists below.
Quickstart examples:
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (MXNet)
- Quickstart (scikit-learn)
Other examples:
- Raspberry Pi & Nvidia Jetson Tutorial
- PyTorch: From Centralized to Federated
- MXNet: From Centralized to Federated
- Advanced Flower with TensorFlow/Keras
- Single-Machine Simulation of Federated Learning Systems
- Federated learning example with scikit-learn
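The examples above each pair client code with a Flower server process. As a rough, hypothetical sketch of the server side, the snippet below starts a server with the built-in FedAvg strategy; the port, round count, and client thresholds are illustrative assumptions, not values taken from any of the listed examples.

```python
# Minimal sketch (not one of the listed examples): start a Flower server that
# aggregates updates with FedAvg; clients like the one sketched above connect to it.
import flwr as fl

strategy = fl.server.strategy.FedAvg(
    fraction_fit=1.0,         # sample all available clients for training
    min_fit_clients=2,        # wait for at least two clients per round
    min_available_clients=2,  # require two connected clients before starting
)

fl.server.start_server(
    server_address="[::]:8080",
    config={"num_rounds": 3},
    strategy=strategy,
)
```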
Flower Baselines / Datasets
Coming soon - curious minds can take a peek at baselines.
Citation
If you publish work that uses Flower, please cite Flower as follows:
@article{beutel2020flower,
title={Flower: A Friendly Federated Learning Research Framework},
author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Parcollet, Titouan and Lane, Nicholas D},
journal={arXiv preprint arXiv:2007.14390},
year={2020}
}
Please also consider adding your publication to the list of Flower-based publications in the docs; to do so, just open a Pull Request.
Contributing to Flower
We welcome contributions. Please see CONTRIBUTING.md to get started!