Flower - A Friendly Federated Learning Framework
Flower (flwr) is a framework for building federated learning systems. The design of Flower is based on a few guiding principles:
- Customizable: Federated learning systems vary wildly from one use case to another. Flower allows for a wide range of different configurations depending on the needs of each individual use case.
- Extendable: Flower originated from a research project at the University of Oxford, so it was built with AI research in mind. Many components can be extended and overridden to build new state-of-the-art systems.
- Framework-agnostic: Different machine learning frameworks have different strengths. Flower can be used with any machine learning framework, for example, PyTorch, TensorFlow, MXNet, scikit-learn, or even raw NumPy for users who enjoy computing gradients by hand (a minimal sketch follows this list).
- Understandable: Flower is written with maintainability in mind. The community is encouraged to both read and contribute to the codebase.
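To give a feel for the API, here is a minimal sketch of a Flower client built on raw NumPy, with no ML framework involved. It is illustrative only and written against the flwr API around this nightly release (0.17), so method signatures may differ in other versions; the class name, the dummy "training" step, and the server address are assumptions made for the example.

```python
import numpy as np
import flwr as fl


class NumpyToyClient(fl.client.NumPyClient):
    """Toy client holding a single weight vector (no ML framework needed)."""

    def __init__(self):
        self.weights = np.zeros(10)

    def get_parameters(self):
        # Return the current model parameters as a list of NumPy ndarrays.
        return [self.weights]

    def fit(self, parameters, config):
        # Receive global parameters, "train" locally (here: a dummy update),
        # and return updated parameters plus the number of local examples.
        self.weights = parameters[0] + 0.1
        return [self.weights], 1, {}

    def evaluate(self, parameters, config):
        # Return loss, number of examples, and optional metrics.
        loss = float(np.linalg.norm(parameters[0]))
        return loss, 1, {}


if __name__ == "__main__":
    # Connect this client to a running Flower server.
    fl.client.start_numpy_client("[::]:8080", client=NumpyToyClient())
```

The same client structure works with any framework: get_parameters, fit, and evaluate simply exchange lists of NumPy arrays, so the local training code inside them is entirely up to the user.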
Meet the Flower community on flower.dev!
Documentation
- Installation
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (MXNet)
- Quickstart (scikit-learn [code example])
Flower Usage Examples
A number of examples show different usage scenarios of Flower in combination with popular machine learning frameworks such as PyTorch or TensorFlow. To run an example, first install the necessary extras for that example.
Quickstart examples:
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (MXNet)
- Quickstart (scikit-learn)
Other examples:
- Raspberry Pi & Nvidia Jetson Tutorial
- PyTorch: From Centralized to Federated
- MXNet: From Centralized to Federated
- Advanced Flower with TensorFlow/Keras
- Single-Machine Simulation of Federated Learning Systems
- Federated learning example with scikit-learn
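Most of these examples follow the same pattern: one process runs the Flower server and several processes run clients that connect to it. As a hedged sketch against the 0.17-era API, a server might be started as shown below; the FedAvg parameter values and the number of rounds are arbitrary choices for illustration, not recommendations.

```python
import flwr as fl

# Configure how clients are sampled and how updates are aggregated.
# FedAvg is Flower's built-in federated averaging strategy.
strategy = fl.server.strategy.FedAvg(
    fraction_fit=0.5,          # sample 50% of available clients each round
    min_fit_clients=2,         # never train with fewer than 2 clients
    min_available_clients=2,   # wait until at least 2 clients are connected
)

# Run federated learning for three rounds on the default example address.
fl.server.start_server(
    server_address="[::]:8080",
    config={"num_rounds": 3},
    strategy=strategy,
)
```

Each client process then calls fl.client.start_numpy_client (as in the client sketch above) with the same address, and the server coordinates the rounds of training and evaluation.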
Flower Baselines / Datasets
Coming soon - curious minds can take a peek at baselines.
Citation
If you publish work that uses Flower, please cite Flower as follows:
@article{beutel2020flower,
  title={Flower: A Friendly Federated Learning Research Framework},
  author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Parcollet, Titouan and Lane, Nicholas D},
  journal={arXiv preprint arXiv:2007.14390},
  year={2020}
}
Please also consider adding your publication to the list of Flower-based publications in the docs; to do so, just open a Pull Request.
Contributing to Flower
We welcome contributions. Please see CONTRIBUTING.md to get started!
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: flwr-nightly-0.17.0.dev20210902.tar.gz
Built Distribution: flwr_nightly-0.17.0.dev20210902-py3-none-any.whl
Hashes for flwr-nightly-0.17.0.dev20210902.tar.gz
Algorithm | Hash digest
---|---
SHA256 | 59d57ec518892eb892ead490654da135c4ab51d99dd1085e4638f5fd29a9663f
MD5 | 456244efccf332f7fe24d3b5acf293cc
BLAKE2b-256 | a3bedeff0fb662691689ff8df92a2e0b49ea43c801cc306b4e86972e8575518f
Hashes for flwr_nightly-0.17.0.dev20210902-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | b70eed10dbeafd4e848d55ec92aa82e8fc8f16a3368d6286cbaa672feaf6a2e5
MD5 | 255aa451f3bfde20c15b6f056fffcf3d
BLAKE2b-256 | 66adbb08555c06bf594e821d92c58991b273559ccfb1b3fceb10933defa6cb3f