
dpsa4flower

Server and client to use the flower framework for differentially private federated learning with secure aggregation.

Made to be used with the dpsa infrastructure; see the dpsa project for an explanation of the system's participants and properties. Running additional aggregation servers is required; see there for setup instructions.

Installation

To install, you require the following packages:

  • python version 3.9 or higher.
  • poetry package manager for python

Once you have those, go ahead and clone this repository:

> git clone https://github.com/dpsa-project/dpsa4flower.git

Enter the new directory:

> cd dpsa4flower

Use poetry to create a virtualenv and install all dependencies:

> poetry shell
> poetry install

You're ready to use our classes now. Note that to actually run a learning task, you will need to provide the locations at which two separate dpsa4fl aggregation servers are running. See here for instructions, or check out our example project.

Example code

There is an example repository implementing the CIFAR learning task with a torch model, where training is federated using flower with differential privacy and secure aggregation.

Classes

This package exposes two classes, one for the server and one for the client.

DPSAServer(model_size, privacy_parameter, granularity, aggregator1_location, aggregator2_location, client_manager, strategy)

The dpsa4flower server class extends the flower server class with what is needed to use DPSA for aggregation. It handles configuration of the aggregator servers, reshaping of the collected aggregation results, and redistribution of the updates to the clients. Construction requires the following parameters as keyword arguments:

  • model_size: int The number of parameters of the model to be trained.
  • privacy_parameter: float The desired privacy per learning step. Each aggregation step is ρ-zero-concentrated differentially private for each client, with ρ = (1/2)·privacy_parameter².
  • granularity: int The resolution of the fixed-point encoding used for secure aggregation. A larger value will result in a less lossy representation and more communication and computation overhead. Currently, 16, 32 and 64 bit are supported.
  • aggregator1_location: str Location of the first aggregator server in URL format including the port. For example, for a server running locally: "http://127.0.0.1:9991"
  • aggregator2_location: str Location of the second aggregator server in URL format including the port. For example, for a server running locally: "http://127.0.0.1:9992"
  • client_manager: flwr.server.ClientManager A flower client manager to manage connected clients.
  • strategy: Optional[flwr.server.strategy.Strategy] A flower strategy for the server to use. It will be wrapped, replacing its configure_fit and aggregate_fit methods with ones that interact with the dpsa infrastructure.
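
The effect of granularity can be pictured as a fixed-point encoding of values in [-1, 1]. The sketch below is purely illustrative and is not the exact encoding dpsa4fl uses; it only shows why a larger granularity gives a less lossy representation:

```python
# Illustrative sketch of fixed-point encoding (NOT the exact scheme dpsa4fl uses):
# a float in [-1, 1] is mapped to an integer with `granularity` bits of resolution.

def encode(x: float, granularity: int) -> int:
    """Map x in [-1, 1] to an unsigned fixed-point integer."""
    scale = 2 ** (granularity - 1)
    return round((x + 1.0) * scale)

def decode(n: int, granularity: int) -> float:
    """Inverse of encode; recovers x up to quantization error."""
    scale = 2 ** (granularity - 1)
    return n / scale - 1.0

# Larger granularity -> smaller quantization error:
x = 0.123456789
err16 = abs(decode(encode(x, 16), 16) - x)
err32 = abs(decode(encode(x, 32), 32) - x)
assert err32 < err16
```

The trade-off mentioned above is visible here: each extra bit of granularity halves the worst-case quantization error, but also enlarges the values that have to be secret-shared and summed.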

An example construction of a dpsa4flower server object would look like this:

dpsa4flower.DPSAServer(
    model_size = 62006,
    privacy_parameter = 30,
    granularity = 32,
    aggregator1_location = "http://127.0.0.1:9981",
    aggregator2_location = "http://127.0.0.1:9982",
    client_manager = flwr.server.SimpleClientManager(),
)

The created object can then be used to start a flower server using flwr.server.start_server as usual.
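
Putting the two steps together, a minimal server startup could look like the following sketch. It assumes a flower 1.x API (flwr.server.start_server and flwr.server.ServerConfig); the server address and round count are placeholder values, and both aggregator servers must already be running:

```python
# Sketch: constructing a DPSAServer and handing it to flower as usual.
# Assumes a flower 1.x API; address and num_rounds are placeholders.
import flwr
import dpsa4flower

server = dpsa4flower.DPSAServer(
    model_size = 62006,
    privacy_parameter = 30,
    granularity = 32,
    aggregator1_location = "http://127.0.0.1:9981",
    aggregator2_location = "http://127.0.0.1:9982",
    client_manager = flwr.server.SimpleClientManager(),
)

# The DPSAServer is a regular flower server from flwr's point of view:
flwr.server.start_server(
    server = server,
    server_address = "0.0.0.0:8080",
    config = flwr.server.ServerConfig(num_rounds = 3),
)
```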

DPSANumPyClient(max_privacy_per_round, aggregator1_location, aggregator2_location, client, allow_evaluate)

The dpsa4flower client class implements the NumPyClient interface provided by flower. It is a wrapper for existing NumPyClients, adding secure aggregation and differential privacy: the wrapped client is used for local training, and the results are then submitted to the secure aggregation infrastructure in encrypted form. The constructor requires the following parameters as keyword arguments:

  • max_privacy_per_round: float The maximal zero-concentrated differential privacy budget allowed to be spent on a single round of training. If the selected server offers a weaker guarantee, no data will be submitted and an exception will be raised.
  • aggregator1_location: str Location of the first aggregator server in URL format including the port. For example, for a server running locally: "http://127.0.0.1:9991"
  • aggregator2_location: str Location of the second aggregator server in URL format including the port. For example, for a server running locally: "http://127.0.0.1:9992"
  • client: flower.client.numpy_client.NumPyClient The NumPyClient used for executing the local learning tasks.
  • allow_evaluate: bool Evaluation is a privacy-relevant operation on the client dataset. If this flag is set to False, evaluation always reports infinite loss and zero accuracy to the server. Otherwise, the evaluation function of the wrapped client will be used and the results will be released to the server, potentially compromising privacy. Defaults to False.

An example construction of a dpsa4flower client object would look like this:

dpsa4flower.DPSANumPyClient(
    max_privacy_per_round = 30,
    aggregator1_location = "http://127.0.0.1:9981",
    aggregator2_location = "http://127.0.0.1:9982",
    client = FlowerClient()
)

where FlowerClient is some NumPyClient of your choice. It can then be started using flwr.client.start_numpy_client as usual.
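
Put together, starting a wrapped client could look like this sketch. It assumes a flower 1.x API (flwr.client.start_numpy_client); the server address is a placeholder, FlowerClient stands for your own NumPyClient implementation, and both aggregators must already be running:

```python
# Sketch: wrapping an existing NumPyClient and starting it as usual.
# Assumes a flower 1.x API; the address is a placeholder.
import flwr
import dpsa4flower

dp_client = dpsa4flower.DPSANumPyClient(
    max_privacy_per_round = 30,
    aggregator1_location = "http://127.0.0.1:9981",
    aggregator2_location = "http://127.0.0.1:9982",
    client = FlowerClient(),  # your own NumPyClient implementation
)

# From flwr's point of view, dp_client is an ordinary NumPyClient:
flwr.client.start_numpy_client(
    server_address = "127.0.0.1:8080",
    client = dp_client,
)
```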

What's going on

When using our classes in the setup described (and used in the example project), the training procedure takes place as described in this diagram:

                                             gradient sum
          ┌───────────────────────────────────────────────────────────────────────────────┐
          │                             (differentially private)                          │
          │                                                                               │
          │                                                                               │
          │                                                                               │
          │                                                                               │
          │             gradient shares                                                   │
          │              (ciphertext)                                                     │
  ┌───────▼─────────┬────────────────────┐                                                │
  │ DPSANumPyClient │                    │ ┌──────────────┐                               │
  └─────────────────┴────────────────┐   └─►              │                               │
                                     │     │ Aggregator 1 ├───┐                           │
                                 ┌───)─────►              │   │                           │
                                 │   │     └──────────────┘   │                   ┌───────┴───────┐
          .                      │   │                        │    gradient sum   │               │
          .                      │   │                        ├───────────────────►  DPSAServer   │
          .                      │   │                        │  (differentially  │               │
                                 │   │     ┌──────────────┐   │      private)     └───────────────┘
                                 │   └─────►              │   │
                                 │         │ Aggregator 2 ├───┘
  ┌─────────────────┬────────────┘   ┌─────►              │
  │ DPSANumPyClient │                │     └──────────────┘
  └─────────────────┴────────────────┘
                     gradient shares
                      (ciphertext)



     flower clients                     dpsa4fl infrastructure                     flower server
     --------------                     ----------------------                     -------------
compute gradients locally          checks if clipping was done properly,        collects aggregate,
   on sensitive data,               computes aggregate on ciphertext,           distributes updates
clip to norm 1 and submit          adds noise for differential privacy.         back to the clients
                                     ciphertext can not be decrypted
                                     if the servers don't collaborate
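
The "gradient shares" in the diagram can be pictured as additive secret sharing: each client splits its clipped gradient into two random-looking vectors that sum to the original, and sends one to each aggregator. The following is a pure-Python sketch of that idea over floats; the real protocol works over a finite field with fixed-point encoding and additionally verifies that clipping was done properly, which this sketch does not:

```python
# Illustrative sketch of clip-and-share (NOT the actual dpsa protocol):
# each aggregator sees only one random-looking share per client.
import math
import random

random.seed(0)

def clip_to_unit_norm(grad):
    """Scale the gradient down so its L2 norm is at most 1."""
    norm = math.sqrt(sum(x * x for x in grad))
    return [x / max(norm, 1.0) for x in grad]

def share(grad):
    """Split a vector into two additive shares that sum back to it."""
    mask = [random.gauss(0, 1) for _ in grad]
    return [g - m for g, m in zip(grad, mask)], mask

# Each client clips its gradient and sends one share to each aggregator:
grads = [[random.gauss(0, 1) for _ in range(5)] for _ in range(3)]
all_shares = [share(clip_to_unit_norm(g)) for g in grads]

# Aggregator 1 sums the first shares, aggregator 2 the second:
sum1 = [sum(s[0][i] for s in all_shares) for i in range(5)]
sum2 = [sum(s[1][i] for s in all_shares) for i in range(5)]

# Only the combination of both aggregators' results reveals the aggregate;
# neither sum1 nor sum2 alone says anything about an individual gradient.
aggregate = [a + b for a, b in zip(sum1, sum2)]
expected = [sum(clip_to_unit_norm(g)[i] for g in grads) for i in range(5)]
assert all(abs(a - e) < 1e-9 for a, e in zip(aggregate, expected))
```

In the actual system the aggregators also add calibrated noise to the decoded sum before releasing it, which is where the differential privacy guarantee in the diagram comes from.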
