Flower - A Friendly Federated Learning Framework
Website | Blog | Docs | Conference | Slack
Flower (flwr) is a framework for building federated learning systems. The design of Flower is based on a few guiding principles:
- Customizable: Federated learning systems vary wildly from one use case to another. Flower allows for a wide range of different configurations depending on the needs of each individual use case.
- Extendable: Flower originated from a research project at the University of Oxford, so it was built with AI research in mind. Many components can be extended and overridden to build new state-of-the-art systems.
- Framework-agnostic: Different machine learning frameworks have different strengths. Flower can be used with any machine learning framework, for example, PyTorch, TensorFlow, Hugging Face Transformers, PyTorch Lightning, MXNet, scikit-learn, TFLite, or even raw NumPy for users who enjoy computing gradients by hand.
- Understandable: Flower is written with maintainability in mind. The community is encouraged to both read and contribute to the codebase.
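The principles above revolve around federated learning, where a server repeatedly aggregates model updates from many clients. The canonical aggregation rule, Federated Averaging (FedAvg), can be sketched in raw NumPy; the `fed_avg` helper and the toy two-client setup below are illustrative only, not part of the Flower API:

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Federated averaging: combine client model weights,
    weighted by the number of training examples each client holds."""
    total = sum(client_sizes)
    # Weighted sum of each layer's parameters across clients.
    return [
        sum(w[layer] * (n / total) for w, n in zip(client_weights, client_sizes))
        for layer in range(len(client_weights[0]))
    ]

# Two clients, each with a one-layer "model" (a single weight array).
client_a = [np.array([1.0, 1.0])]
client_b = [np.array([3.0, 3.0])]

# Client A trained on 1 example, client B on 3, so B's weights count 3x.
averaged = fed_avg([client_a, client_b], client_sizes=[1, 3])
print(averaged[0])  # [2.5 2.5]
```

Weighting by dataset size keeps clients with more data from being drowned out by clients with very little, which is the behavior described in the original FedAvg formulation.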
Meet the Flower community on flower.dev!
Documentation
- Installation
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (Hugging Face [code example])
- Quickstart (PyTorch Lightning [code example])
- Quickstart (MXNet)
- Quickstart (scikit-learn)
- Quickstart (TFLite on Android [code example])
Flower Usage Examples
A number of examples show different usage scenarios of Flower (in combination with popular machine learning frameworks such as PyTorch or TensorFlow). To run an example, first install the extras it requires.
Quickstart examples:
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (Hugging Face)
- Quickstart (PyTorch Lightning)
- Quickstart (MXNet)
- Quickstart (scikit-learn)
- Quickstart (TFLite on Android)
Other examples:
- Raspberry Pi & Nvidia Jetson Tutorial
- Android & TFLite
- PyTorch: From Centralized to Federated
- MXNet: From Centralized to Federated
- Advanced Flower with TensorFlow/Keras
- Single-Machine Simulation of Federated Learning Systems
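Across all of these examples, a client exposes the same three operations: report its current model parameters, train on local data (fit), and evaluate a model (evaluate). The sketch below mimics that interface with a plain class and a NumPy "model"; it is a toy stand-in and deliberately does not import Flower, whose actual client API is covered in the quickstarts above:

```python
import numpy as np

class ToyClient:
    """Toy stand-in for a federated learning client.

    Mirrors the shape of the client interface used in the quickstart
    examples (get_parameters / fit / evaluate). The "model" is a single
    weight vector and "training" nudges it toward the local data mean.
    """

    def __init__(self, local_data):
        self.local_data = np.asarray(local_data, dtype=float)
        self.weights = np.zeros_like(self.local_data[0])

    def get_parameters(self):
        return [self.weights]

    def fit(self, parameters, config=None):
        # Start from the globally aggregated parameters...
        self.weights = parameters[0]
        # ...then take one "training" step toward the local data mean.
        self.weights = 0.5 * (self.weights + self.local_data.mean(axis=0))
        # Return updated weights, number of local examples, and metrics.
        return [self.weights], len(self.local_data), {}

    def evaluate(self, parameters, config=None):
        # Report mean squared distance of the local data from the parameters.
        loss = float(((self.local_data - parameters[0]) ** 2).mean())
        return loss, len(self.local_data), {}

client = ToyClient(local_data=[[1.0, 2.0], [3.0, 4.0]])
new_params, num_examples, _ = client.fit([np.zeros(2)])
# -> weights move halfway toward the local data mean [2., 3.]
```

The number of examples returned by fit is what a FedAvg-style server uses as the aggregation weight for this client.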
Flower Baselines / Datasets
Experimental - curious minds can take a peek at baselines.
Community
Flower is built by a wonderful community of researchers and engineers. Join Slack to meet them; contributions are welcome.
Citation
If you publish work that uses Flower, please cite Flower as follows:
@article{beutel2020flower,
title={Flower: A Friendly Federated Learning Research Framework},
author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Parcollet, Titouan and Lane, Nicholas D},
journal={arXiv preprint arXiv:2007.14390},
year={2020}
}
Please also consider adding your publication to the list of Flower-based publications in the docs, just open a Pull Request.
Contributing to Flower
We welcome contributions. Please see CONTRIBUTING.md to get started!
Project details
Download files
Download the file for your platform: a source distribution (tar.gz) and a built distribution (wheel) are available.
File details
Details for the file flwr-nightly-0.18.0.dev20220218.tar.gz.
File metadata
- Download URL: flwr-nightly-0.18.0.dev20220218.tar.gz
- Upload date:
- Size: 63.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.12 CPython/3.7.12 Linux/5.11.0-1028-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 734fec1abb4f9e84a4c1a55324061954885aaf32a8f86047d5632395538aa234
MD5 | de19193a52ef2c283984fa20c026eb35
BLAKE2b-256 | 0c5c8d2cb2bafb89e69f582d6f53e01b7ba290146a59ac8fe60c6fe1ba642719
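A downloaded file can be checked against the published digests with Python's standard hashlib module. A minimal sketch (the file path in the comment is illustrative):

```python
import hashlib

def sha256_hex(path, chunk_size=1 << 20):
    """Compute the SHA256 hex digest of a file, reading in chunks
    so large downloads don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the SHA256 digest listed above, e.g.:
# sha256_hex("flwr-nightly-0.18.0.dev20220218.tar.gz")
```

If the computed digest does not match the one on this page, the download is corrupt or has been tampered with and should be discarded.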
File details
Details for the file flwr_nightly-0.18.0.dev20220218-py3-none-any.whl.
File metadata
- Download URL: flwr_nightly-0.18.0.dev20220218-py3-none-any.whl
- Upload date:
- Size: 105.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.12 CPython/3.7.12 Linux/5.11.0-1028-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 36c6ca3b68c67ba634753954a8769448e59345034ceea4933a7fdf8e9eed26eb
MD5 | a03764b9438ec7a038f4b8a3833b0ee2
BLAKE2b-256 | 8cb0dfddc65716da03581420fd8faf420a6dd21156eaecfd5e5bb6cbf990c7e0