Simulation framework for Private Federated Learning

Project description

pfl: Python framework for Private Federated Learning simulations

Documentation website: https://apple.github.io/pfl-research

pfl is a Python framework developed at Apple to empower researchers to run efficient simulations with privacy-preserving federated learning (FL) and disseminate the results of their research in FL. The framework is developed by a team with both engineering and research expertise, and we encourage researchers to publish papers using this code with confidence.

The framework is not intended to be used for third-party FL deployments but the results of the simulations can be tremendously useful in actual FL deployments. We hope that pfl will promote open research in FL and its effective dissemination.

pfl provides several useful features, including the following:

  • Get started quickly trying out PFL for your use case with your existing model and data.
  • Iterate quickly with fast simulations utilizing multiple levels of distributed training (multiple processes, GPUs and machines).
  • Flexibility and expressiveness - when a researcher has a PFL idea to try, pfl has flexible APIs to express these ideas.
  • Scalable simulations for large experiments with state-of-the-art algorithms and models.
  • Support both PyTorch and TensorFlow.
  • Unified benchmarks for datasets that have been vetted for both PyTorch and TensorFlow.
  • Support other models in addition to neural networks, e.g. GBDTs. Switching between types of models is seamless.
  • Tight integration with privacy features, including common mechanisms for local and central differential privacy.

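To make the simulation workflow concrete, here is a minimal federated averaging loop in plain NumPy. This is only an illustrative sketch of what such a simulation computes, not the pfl API; all names (`local_sgd`, `clients`, the toy linear model) are hypothetical.

```python
# Illustrative sketch (not the pfl API): a toy federated averaging
# simulation on a linear regression model with simulated clients.
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(weights, data, targets, lr=0.1, epochs=5):
    """One client's local training: plain gradient descent on a linear model."""
    w = weights.copy()
    for _ in range(epochs):
        grad = data.T @ (data @ w - targets) / len(targets)
        w -= lr * grad
    return w

# Simulate 10 clients, each holding a private shard of data.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(10):
    x = rng.normal(size=(20, 2))
    y = x @ true_w + rng.normal(scale=0.1, size=20)
    clients.append((x, y))

# Federated averaging: each round, a sampled cohort trains locally
# and the server averages the resulting model deltas.
w = np.zeros(2)
for _ in range(30):
    cohort = rng.choice(len(clients), size=5, replace=False)
    deltas = [local_sgd(w, *clients[i]) - w for i in cohort]
    w += np.mean(deltas, axis=0)
```

pfl runs this kind of loop at scale, distributing simulated clients across processes, GPUs, and machines.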
Results from benchmarks are maintained in this Weights & Biases report.

Installation

Installation instructions can be found here. pfl is available on PyPI and a full installation can be done with pip:

pip install 'pfl[tf,pytorch,trees]'

Getting started - tutorial notebooks

To try out pfl immediately without installation, we provide several Colab notebooks for hands-on learning of the different components in pfl.

  • Introduction to Federated Learning with CIFAR10 and TensorFlow.
  • Introduction to PFL research with FLAIR and PyTorch.
  • Introduction to Differential Privacy (DP) with Federated Learning.
  • Creating Federated Dataset for PFL Experiment.
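As background for the DP notebook above, here is a sketch of central DP via the Gaussian mechanism: clip each client's update to bound its sensitivity, then add calibrated Gaussian noise to the aggregate. This is an illustrative example, not pfl's DP API; the function names and parameter values are hypothetical.

```python
# Illustrative sketch (not pfl's DP API): the Gaussian mechanism as
# used in DP federated averaging.
import numpy as np

rng = np.random.default_rng(0)

def clip_update(update, clip_norm=1.0):
    """Bound each client's contribution (L2 sensitivity) by clip_norm."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / norm)

def private_mean(updates, clip_norm=1.0, noise_multiplier=1.1):
    """Clip each update, sum, add Gaussian noise scaled to the
    clipping bound, then average."""
    clipped = [clip_update(u, clip_norm) for u in updates]
    noisy_sum = np.sum(clipped, axis=0) + rng.normal(
        scale=noise_multiplier * clip_norm, size=clipped[0].shape)
    return noisy_sum / len(updates)

updates = [rng.normal(size=4) for _ in range(100)]
noisy_average = private_mean(updates)
```

The privacy guarantee depends on the noise multiplier, the cohort size, and the number of rounds; the DP notebook covers how these are accounted for.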

We also support MLX! This notebook must be run locally on Apple silicon; see all available Jupyter notebooks here.
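Creating a federated dataset, as covered in the notebook above, amounts to partitioning a centralized dataset into per-user shards. A minimal sketch of that idea, with hypothetical data and user ids (this is not the pfl API):

```python
# Illustrative sketch (not the pfl API): grouping a centralized
# dataset by user id so each entry simulates one device's local data.
import numpy as np

rng = np.random.default_rng(0)

# A centralized dataset: features plus an (arbitrary) user id per row.
features = rng.normal(size=(1000, 8))
user_ids = rng.integers(0, 50, size=1000)

# Partition rows by user id into per-user shards.
federated_data = {
    uid: features[user_ids == uid] for uid in np.unique(user_ids)
}

# Sampling a cohort of users for one simulated FL round.
cohort = rng.choice(list(federated_data), size=10, replace=False)
```

pfl provides federated dataset abstractions that handle this partitioning and cohort sampling for you, including for the benchmark datasets.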

Getting started - benchmarks

pfl aims to streamline the benchmarking process of testing hypotheses in the Federated Learning paradigm. The official benchmarks are available in the benchmarks directory, using a variety of realistic dataset-model combinations with and without differential privacy (yes, we do also have CIFAR10).

Copying these examples is a great starting point for doing your own research. See the quickstart on how to start converging a model on the simplest benchmark (CIFAR10) in just a few minutes.

Contributing

Researchers are invited to contribute to the framework. Please see here for more details.

Citing pfl-research

@article{granqvist2024pfl,
  title={pfl-research: simulation framework for accelerating research in Private Federated Learning},
  author={Granqvist, Filip and Song, Congzheng and Cahill, {\'A}ine and van Dalen, Rogier and Pelikan, Martin and Chan, Yi Sheng and Feng, Xiaojun and Krishnaswami, Natarajan and Jina, Vojta and Chitnis, Mona},
  journal={arXiv preprint arXiv:2404.06430},
  year={2024},
}
