
Fast (and cheeky) differentially private gradient-based optimisation in PyTorch

Project description

deepee

deepee is a library for differentially private deep learning in PyTorch. More precisely, deepee implements the Differentially Private Stochastic Gradient Descent (DP-SGD) algorithm originally described by Abadi et al. Despite the name, deepee works with any first-order optimizer, including Adam, AdaGrad, etc.
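
For intuition, here is a minimal plain-PyTorch sketch of the DP-SGD step from Abadi et al.: clip each per-sample gradient, sum, add calibrated Gaussian noise, and average. This is a conceptual illustration of the algorithm, not deepee's API; the function name dp_sgd_step and the naive per-sample loop are ours.

```python
import torch

def dp_sgd_step(model, loss_fn, batch_x, batch_y, optimizer,
                l2_clip=1.0, noise_multiplier=1.0):
    """One DP-SGD step: clip each per-sample gradient to L2 norm
    l2_clip, sum them, add Gaussian noise with std noise_multiplier
    * l2_clip, average over the batch, then step the optimizer."""
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]

    # Naive per-sample loop for clarity; deepee computes these
    # gradients in parallel instead.
    for x, y in zip(batch_x, batch_y):
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)
        norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        factor = (l2_clip / (norm + 1e-6)).clamp(max=1.0)  # clip only, never rescale up
        for acc, g in zip(summed, grads):
            acc.add_(g * factor)

    batch_size = batch_x.shape[0]
    for p, acc in zip(params, summed):
        noise = torch.randn_like(acc) * noise_multiplier * l2_clip
        p.grad = (acc + noise) / batch_size  # noised average gradient

    optimizer.step()  # works with SGD, Adam, AdaGrad, ...
```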

It wraps a regular PyTorch model and takes care of calculating per-sample gradients, clipping them, adding noise, and accumulating gradients, with an API that closely mimics that of the original PyTorch model.
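
A minimal training-loop sketch of how the wrapper is used. The PrivacyWrapper class, its constructor arguments, and the clip_and_accumulate / noise_gradient / prepare_next_batch methods follow our reading of the deepee documentation; verify them against the docs linked below before relying on them.

```python
import torch
from deepee import PrivacyWrapper  # names per the deepee docs; check there

base = torch.nn.Linear(10, 2)
# num_replicas matches the batch size; L2_clip and noise_multiplier
# set the clipping bound and noise scale of DP-SGD.
model = PrivacyWrapper(base, num_replicas=32, L2_clip=1.0, noise_multiplier=1.0)
optimizer = torch.optim.SGD(model.wrapped_model.parameters(), lr=0.1)

data, targets = torch.randn(32, 10), torch.randint(0, 2, (32,))

optimizer.zero_grad()
loss = torch.nn.functional.cross_entropy(model(data), targets)
loss.backward()
model.clip_and_accumulate()  # clip per-sample gradients and sum them
model.noise_gradient()       # add calibrated Gaussian noise
optimizer.step()             # any first-order optimizer works here
model.prepare_next_batch()   # re-sync the replicas for the next batch
```

Keeping the clip and noise steps as explicit calls preserves the familiar zero_grad / backward / step rhythm of an ordinary PyTorch training loop.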

Check out the documentation here

For paper readers

If you would like to reproduce the results from our paper, please go here

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution: deepee-0.1.9.tar.gz (17.4 kB)

Built Distribution: deepee-0.1.9-py3-none-any.whl (19.8 kB)
