Federated Learning for the Edge
Project description
Open Federated Learning (OpenFL) is a Python 3 framework for Federated Learning. OpenFL is designed to be a flexible, extensible, and easy-to-learn tool for data scientists. OpenFL is hosted by The Linux Foundation, aims to be community-driven, and welcomes contributions back to the project.
Looking for the Open Flash Library project, also referred to as OpenFL? Find it here!
Installation
You can simply install OpenFL from PyPI:
$ pip install openfl
For more installation options check out the online documentation.
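As a quick post-install sanity check, you can try importing the package from a Python shell. This is only a minimal sketch; it assumes the package exposes a `__version__` attribute, which may differ between releases.

```python
# Minimal post-install check (assumes openfl exposes __version__; adjust if your release differs).
import openfl

print(openfl.__version__)  # expected to print something like "1.6"
```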
Getting Started
OpenFL enables data scientists to set up a federated learning experiment following one of these workflows:
- Director-based Workflow: Set up long-lived components to run many experiments in series. Recommended for FL research when many changes to the model, dataloader, or hyperparameters are expected.
- Aggregator-based Workflow: Define an experiment and distribute it manually. All participants can verify the model code and FL plan prior to execution. The federation is terminated when the experiment is finished.
- Workflow Interface (experimental): Create complex experiments that extend beyond traditional horizontal federated learning (a minimal sketch follows this list). See the experimental tutorials to learn how to coordinate aggregator validation after collaborator model training, perform global differentially private federated learning, measure the amount of private information embedded in a model after collaborator training with Privacy Meter, or add a watermark to a federated model.
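As referenced above, here is a minimal sketch of what a Workflow Interface flow can look like, adapted loosely from the experimental tutorials. The module paths, decorator names, and the `foreach` argument are assumptions based on the experimental API and may change between releases; local training and aggregation logic are intentionally omitted.

```python
# Hedged sketch of an experimental Workflow Interface flow; names are assumptions
# drawn from the tutorials and may differ by release.
from openfl.experimental.interface import FLSpec
from openfl.experimental.placement import aggregator, collaborator


class SimpleFlow(FLSpec):

    @aggregator
    def start(self):
        # Runs on the aggregator: initialize state, then fan out to collaborators.
        self.round_number = 0
        self.next(self.local_train, foreach='collaborators')

    @collaborator
    def local_train(self):
        # Runs on each collaborator: train on local data here (omitted).
        self.next(self.join)

    @aggregator
    def join(self, inputs):
        # Runs on the aggregator: `inputs` holds per-collaborator state to combine (omitted).
        self.next(self.end)

    @aggregator
    def end(self):
        # Flows finish with an `end` step.
        print('Flow complete')
```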
The quickest way to test OpenFL is to follow our tutorials.
Read the blog post explaining the steps to train a model with OpenFL.
Check out the online documentation to launch your first federation.
Requirements
- Ubuntu Linux 18.04+
- Python 3.7+ (we recommend using it inside a virtual environment such as Virtualenv).
OpenFL supports training with TensorFlow 2+ or PyTorch 1.3+, which should be installed separately. Users can extend the list of supported deep learning frameworks if needed.
Project Overview
What is Federated Learning
Federated learning is a distributed machine learning approach that enables collaboration on machine learning projects without having to share sensitive data, such as patient records, financial data, or classified information. The only data that ever need to move across the federation are the model parameters and their updates.
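As a concrete illustration of that claim, the sketch below shows the core of FedAvg-style aggregation in plain NumPy: collaborators share only their parameters and local sample counts, and the aggregator combines them with a sample-weighted average. This is for intuition only and is not OpenFL's internal API; all names are hypothetical.

```python
# Illustration of the FedAvg idea: combine model updates without moving raw data.
import numpy as np

def federated_average(updates):
    """Sample-weighted average of parameter dicts.

    `updates` is a list of (params, num_samples) pairs, where `params`
    maps layer names to NumPy arrays.
    """
    total = sum(n for _, n in updates)
    averaged = {}
    for name in updates[0][0]:
        averaged[name] = sum(params[name] * (n / total) for params, n in updates)
    return averaged

# Two collaborators share only weights and sample counts, never their data.
collab_a = ({'w': np.array([1.0, 2.0])}, 100)
collab_b = ({'w': np.array([3.0, 4.0])}, 300)
print(federated_average([collab_a, collab_b])['w'])  # [2.5 3.5]
```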
Background
OpenFL builds on a collaboration between Intel and the Bakas lab at the University of Pennsylvania (UPenn) to develop the Federated Tumor Segmentation (FeTS, www.fets.ai) platform (grant award number: U01-CA242871).
The grant for FeTS was awarded by the Informatics Technology for Cancer Research (ITCR) program of the National Cancer Institute (NCI) of the National Institutes of Health (NIH) to Dr. Spyridon Bakas (Principal Investigator), who was then affiliated with the Center for Biomedical Image Computing and Analytics (CBICA) at UPenn and now heads the Division of Computational Pathology at Indiana University (IU).
FeTS is a real-world medical federated learning platform with international collaborators. The original OpenFederatedLearning project and OpenFL are designed to serve as the backend for the FeTS platform, and OpenFL developers and researchers continue to work very closely with IU on the FeTS project. An example is the FeTS-AI/Front-End, which integrates the group's medical AI expertise with the OpenFL framework to create a federated learning solution for medical imaging.
Although initially developed for use in medical imaging, OpenFL is designed to be agnostic to the use case, the industry, and the machine learning framework.
You can find more details in the following articles:
- Pati S, et al., 2022
- Reina A, et al., 2021
- Sheller MJ, et al., 2020
- Sheller MJ, et al., 2019
- Yang Y, et al., 2019
- McMahan HB, et al., 2016
Supported Aggregation Algorithms
Algorithm Name | Paper | PyTorch implementation | TensorFlow implementation | Other frameworks compatibility | How to use |
---|---|---|---|---|---|
FedAvg | McMahan et al., 2017 | ✅ | ✅ | ✅ | docs |
FedProx | Li et al., 2020 | ✅ | ✅ | ❌ | docs |
FedOpt | Reddi et al., 2020 | ✅ | ✅ | ✅ | docs |
FedCurv | Shoham et al., 2019 | ✅ | ❌ | ❌ | docs |
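For intuition on how these algorithms differ, consider FedProx: compared with FedAvg, each collaborator adds a proximal term to its local objective that penalizes drift from the global weights received at the start of the round. The sketch below is a conceptual illustration of that objective in plain PyTorch, not OpenFL's built-in implementation; `model`, `global_params`, and `mu` are hypothetical placeholders, and the table's "docs" links describe how to enable each algorithm inside OpenFL itself.

```python
# Conceptual illustration of the FedProx local objective (Li et al., 2020):
# task_loss + (mu / 2) * ||w - w_global||^2 over all trainable parameters.
import torch

def fedprox_loss(task_loss, model, global_params, mu=0.01):
    proximal = sum(
        torch.sum((param - g_param.detach()) ** 2)
        for param, g_param in zip(model.parameters(), global_params)
    )
    return task_loss + (mu / 2.0) * proximal
```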
Support
Please join us for our bi-monthly community meetings starting December 1 & 2, 2022!
Meet some of the team members behind OpenFL.
We will go over our roadmap, hold open Q&A, and welcome idea sharing.
The calendar and links to the community calls are here.
Subscribe to the OpenFL mailing list: openfl-announce@lists.lfaidata.foundation.
See you there!
We also welcome questions, issue reports, and suggestions at any time via GitHub Issues.
License
This project is licensed under Apache License Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
Citation
@article{openfl_citation,
author={Foley, Patrick and Sheller, Micah J and Edwards, Brandon and Pati, Sarthak and Riviera, Walter and Sharma, Mansi and Moorthy, Prakash Narayana and Wang, Shi-han and Martin, Jason and Mirhaji, Parsa and Shah, Prashant and Bakas, Spyridon},
title={OpenFL: the open federated learning library},
journal={Physics in Medicine \& Biology},
url={http://iopscience.iop.org/article/10.1088/1361-6560/ac97d9},
year={2022},
doi={10.1088/1361-6560/ac97d9},
publisher={IOP Publishing}
}
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file openfl-1.6.tar.gz.
File metadata
- Download URL: openfl-1.6.tar.gz
- Upload date:
- Size: 11.4 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 88e2c6b59be85a8308925a195772981bea0ce13208dab7616a3e718641531d08
MD5 | 6132401e6bf902ae9295c0d7fe2e7f4b
BLAKE2b-256 | 05d5867b8a36abab4056b2b1263fc44c53a2436c16f27f4fc3a0514ed3e8c252
File details
Details for the file openfl-1.6-py3-none-any.whl.
File metadata
- Download URL: openfl-1.6-py3-none-any.whl
- Upload date:
- Size: 11.8 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | f0d0fffe73f9933b54299d7c162d12b61ffdde59e9bf70358a8ddd62060f15e7
MD5 | c628f768b908c5db561d0e2ef87d1d95
BLAKE2b-256 | e53362b50a74d6e5ad746690bc580131b01abb2d633fc4cd69c72bcec4b97369