Flower - A Friendly Federated Learning Framework
Website | Blog | Docs | Conference | Slack
Flower (`flwr`) is a framework for building federated learning systems. The design of Flower is based on a few guiding principles:
- Customizable: Federated learning systems vary wildly from one use case to another. Flower allows for a wide range of different configurations depending on the needs of each individual use case.
- Extendable: Flower originated from a research project at the University of Oxford, so it was built with AI research in mind. Many components can be extended and overridden to build new state-of-the-art systems.
- Framework-agnostic: Different machine learning frameworks have different strengths. Flower can be used with any machine learning framework, for example, PyTorch, TensorFlow, Hugging Face Transformers, PyTorch Lightning, MXNet, scikit-learn, JAX, TFLite, or even raw NumPy for users who enjoy computing gradients by hand.
- Understandable: Flower is written with maintainability in mind. The community is encouraged to both read and contribute to the codebase.
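To make "federated" concrete: a server coordinates rounds in which clients train on their own data locally and the server aggregates the resulting model updates, for example with the weighted averaging used by FedAvg. A minimal sketch of that aggregation step in plain Python (the function name `fedavg` and the toy weights are illustrative, not Flower's API):

```python
# Sketch of FedAvg-style aggregation: average client model weights,
# weighted by the number of local training examples on each client.
def fedavg(client_updates):
    """client_updates: list of (weights, num_examples) pairs,
    where weights is a flat list of floats."""
    total = sum(n for _, n in client_updates)
    dim = len(client_updates[0][0])
    return [
        sum(w[i] * n for w, n in client_updates) / total
        for i in range(dim)
    ]

# Two clients: one trained on 10 examples, one on 30.
aggregated = fedavg([([1.0, 2.0], 10), ([3.0, 4.0], 30)])
print(aggregated)  # [2.5, 3.5]
```

In a real Flower deployment the framework handles the client/server communication and lets you plug in aggregation strategies; this sketch only shows the arithmetic at the heart of one round.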
Meet the Flower community on flower.dev!
Documentation
- Installation
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (Hugging Face [code example])
- Quickstart (PyTorch Lightning [code example])
- Quickstart (MXNet)
- Quickstart (JAX [code example])
- Quickstart (scikit-learn)
- Quickstart (TFLite on Android [code example])
Flower Baselines
Flower Baselines is a collection of community-contributed experiments that reproduce the experiments performed in popular federated learning publications. Researchers can build on Flower Baselines to quickly evaluate new ideas:
- FedBN: Federated Learning on non-IID Features via Local Batch Normalization
- Adaptive Federated Optimization
Check the Flower documentation to learn more: Using Baselines
The Flower community loves contributions! Make your work more visible and enable others to build on it by contributing it as a baseline: Contributing Baselines
Flower Usage Examples
A number of examples show different usage scenarios of Flower (in combination with popular machine learning frameworks such as PyTorch or TensorFlow). To run an example, first install the extras that the example requires.
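The exact dependencies vary per example (each example's README lists them); a typical starting point, assuming installation from PyPI, looks like this (the `torch`/`torchvision` line is just one example of a framework-specific extra):

```shell
# Install the Flower package itself
# (this nightly build is published on PyPI as flwr-nightly)
pip install flwr

# Framework-specific dependencies come from the chosen example,
# e.g. PyTorch for the PyTorch quickstart:
pip install torch torchvision
```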
Quickstart examples:
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (Hugging Face)
- Quickstart (PyTorch Lightning)
- Quickstart (MXNet)
- Quickstart (scikit-learn)
- Quickstart (TFLite on Android)
Other examples:
- Raspberry Pi & Nvidia Jetson Tutorial
- Android & TFLite
- PyTorch: From Centralized to Federated
- MXNet: From Centralized to Federated
- JAX: From Centralized to Federated
- Advanced Flower with TensorFlow/Keras
- Single-Machine Simulation of Federated Learning Systems
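The single-machine simulation example runs the server and all clients in one process, which is handy for research iteration. The control flow can be sketched in plain Python (the names `local_update` and `simulate` are illustrative, not Flower's simulation API):

```python
# Toy single-machine simulation of federated rounds:
# each simulated client nudges a shared scalar model toward its own
# local target, and the server averages the results after every round.
def local_update(model, target, lr=0.5):
    """One client's local training step toward its own data (target)."""
    return model + lr * (target - model)

def simulate(num_rounds, targets, model=0.0):
    for _ in range(num_rounds):
        updates = [local_update(model, t) for t in targets]
        model = sum(updates) / len(updates)  # FedAvg with equal weights
    return model

final = simulate(num_rounds=20, targets=[1.0, 3.0])
print(round(final, 3))  # converges toward the mean of client targets, 2.0
```

Flower's actual simulation support schedules real client objects instead of toy scalars, but the round structure, local training followed by server-side aggregation, is the same.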
Community
Flower is built by a wonderful community of researchers and engineers. Join Slack to meet them; contributions are welcome.
Citation
If you publish work that uses Flower, please cite Flower as follows:
@article{beutel2020flower,
  title={Flower: A Friendly Federated Learning Research Framework},
  author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Parcollet, Titouan and Lane, Nicholas D},
  journal={arXiv preprint arXiv:2007.14390},
  year={2020}
}
Please also consider adding your publication to the list of Flower-based publications in the docs; just open a Pull Request.
Contributing to Flower
We welcome contributions. Please see CONTRIBUTING.md to get started!
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
Hashes for flwr-nightly-0.19.0.dev20220325.tar.gz
Algorithm | Hash digest
---|---
SHA256 | fd3c3d35ef3d33205aefb7670c3622895742ab0bbbd0e83f071c71ce79a53757
MD5 | e29278dda226e0c6fd92589f4e0e81d1
BLAKE2b-256 | 8c1d5ee817e69610260464e6e5c71189af83deed189f1614a19e9c9cf516642c
Hashes for flwr_nightly-0.19.0.dev20220325-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | bd584313ba7e28fb3e2eaf258ea203d101d4a96c85aac9b103ade30e4b62d617
MD5 | 9971addebb9e2d4492e6cff177b5988a
BLAKE2b-256 | 569adfc0a082eab3a71977c6b79efe445347aca630d671ff917e8c4b084dd9d3