Flower - A Friendly Federated Learning Framework
Flower (flwr) is a framework for building federated learning systems. The design of Flower is based on a few guiding principles:
- Customizable: Federated learning systems vary wildly from one use case to another. Flower allows for a wide range of different configurations depending on the needs of each individual use case.
- Extendable: Flower originated from a research project at the University of Oxford, so it was built with AI research in mind. Many components can be extended and overridden to build new state-of-the-art systems.
- Framework-agnostic: Different machine learning frameworks have different strengths. Flower can be used with any machine learning framework, for example, PyTorch, TensorFlow, MXNet, scikit-learn, or even raw NumPy for users who enjoy computing gradients by hand (see the client sketch after this list).
- Understandable: Flower is written with maintainability in mind. The community is encouraged to both read and contribute to the codebase.
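To make the framework-agnostic point concrete, here is a minimal sketch of a Flower client that uses nothing but NumPy. The class name, the toy "model" (a single weight vector), and the server address are illustrative placeholders, and the exact NumPyClient method signatures and start_numpy_client/start_server arguments may differ between Flower releases, so treat this as a sketch rather than copy-paste code:

import flwr as fl
import numpy as np

class NumPyExampleClient(fl.client.NumPyClient):
    # Hypothetical client: the "model" is just one NumPy weight vector.
    def __init__(self):
        self.weights = np.zeros(10)

    def get_parameters(self):
        # Return the current local model parameters as a list of ndarrays.
        return [self.weights]

    def fit(self, parameters, config):
        # Receive global parameters, run a (placeholder) local update, and
        # return the updated parameters, the number of local examples, and
        # an (empty) metrics dictionary.
        self.weights = parameters[0] + 0.01
        return [self.weights], 1, {}

    def evaluate(self, parameters, config):
        # Return a (placeholder) loss, the number of local examples, and
        # an (empty) metrics dictionary.
        loss = float(np.mean(parameters[0] ** 2))
        return loss, 1, {}

# Server side (typically a separate process), e.g.:
# fl.server.start_server("[::]:8080", config={"num_rounds": 3})

# Client side: connect to the server at an example address.
fl.client.start_numpy_client("[::]:8080", client=NumPyExampleClient())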
Meet the Flower community on flower.dev!
Documentation
- Installation
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (MXNet)
- Quickstart (scikit-learn [code example])
Flower Usage Examples
A number of examples show different usage scenarios of Flower (in combination with popular machine learning frameworks such as PyTorch or TensorFlow). To run an example, first install the necessary extras:
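As a rough sketch only: the core package installs from PyPI, and individual examples can then pull in their framework-specific dependencies via extras. The extra name used below is a placeholder, not a published extra; check the chosen example's own instructions for the exact extras to install.

pip install flwr
# "examples-tensorflow" is a placeholder extra name, shown only to
# illustrate the pip extras syntax; see each example's README.
pip install "flwr[examples-tensorflow]"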
Quickstart examples:
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (MXNet)
- Quickstart (scikit-learn)
Other examples:
- Raspberry Pi & Nvidia Jetson Tutorial
- PyTorch: From Centralized to Federated
- MXNet: From Centralized to Federated
- Advanced Flower with TensorFlow/Keras
- Single-Machine Simulation of Federated Learning Systems
- Federated learning example with scikit-learn
Flower Baselines / Datasets
Coming soon - curious minds can take a peek at baselines.
Citation
If you publish work that uses Flower, please cite Flower as follows:
@article{beutel2020flower,
  title={Flower: A Friendly Federated Learning Research Framework},
  author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Parcollet, Titouan and Lane, Nicholas D},
  journal={arXiv preprint arXiv:2007.14390},
  year={2020}
}
Please also consider adding your publication to the list of Flower-based publications in the docs; just open a Pull Request.
Contributing to Flower
We welcome contributions. Please see CONTRIBUTING.md to get started!
Download files
Source Distribution: flwr-nightly-0.17.0.dev20210818.tar.gz
Built Distribution: flwr_nightly-0.17.0.dev20210818-py3-none-any.whl
Hashes for flwr-nightly-0.17.0.dev20210818.tar.gz
Algorithm | Hash digest
---|---
SHA256 | 03c6d58edb066352adad31413d08ac0f5c7e751bc906d16b84cc86b369d055d3
MD5 | d768522cb9c4a0dfcfc3949f9b2353fc
BLAKE2b-256 | 9d397ac57d8abccd356bef187d668862019737d70a50ee5de9069952ecb818e1
Hashes for flwr_nightly-0.17.0.dev20210818-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 74b493c56ed92c1e64f52c8b50ccd4d8b982eadd1c2445d74119a09ba5483ce9
MD5 | cb258353814315616d414f738a381257
BLAKE2b-256 | dce0113afd441bd12d30016f6c5d825067df408ca9d934c3b5a3cb7d0eab323e