
A toolkit for tools and techniques related to the privacy and compliance of AI models.

Reason this release was yanked:

Distribution package error, was fixed in version 0.0.2

Project description

ai-privacy-toolkit



The first release of this toolkit contains a single module called anonymization. This module provides methods for anonymizing ML model training data, so that a model retrained on the anonymized data can itself be considered anonymous. This may help exempt the model from certain obligations and restrictions set out in data protection regulations such as the GDPR and CCPA.
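To illustrate the idea behind training-data anonymization, here is a minimal sketch of one simple technique: suppressing quasi-identifier values whose combination appears fewer than k times, so every remaining record is indistinguishable from at least k-1 others on those attributes. The function name and suppression strategy are illustrative assumptions for this sketch, not the toolkit's actual API (the anonymization module uses its own, model-guided approach; see the official documentation).

```python
from collections import defaultdict

def anonymize(records, quasi_ids, k=2):
    """Illustrative k-anonymity-style suppression (NOT the toolkit API).

    Any combination of quasi-identifier values that occurs fewer than
    k times is replaced with the suppressed marker '*', so each record
    matches at least k-1 others on the quasi-identifiers.
    """
    # Count occurrences of each quasi-identifier combination.
    counts = defaultdict(int)
    for r in records:
        counts[tuple(r[q] for q in quasi_ids)] += 1

    out = []
    for r in records:
        key = tuple(r[q] for q in quasi_ids)
        new = dict(r)  # leave the original record untouched
        if counts[key] < k:
            for q in quasi_ids:
                new[q] = "*"  # suppress rare, re-identifying combinations
        out.append(new)
    return out
```

A model retrained on the output of such a transformation no longer depends on the rare attribute combinations that could single out an individual; the toolkit's anonymization module pursues the same goal while trying to preserve the model's accuracy.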

Official ai-privacy-toolkit documentation: https://ai-privacy-toolkit.readthedocs.io/en/latest/

Related toolkits:

ai-minimization-toolkit: A toolkit for reducing the amount of personal data needed to perform predictions with a machine learning model.

differential-privacy-library: A general-purpose library for experimenting with, investigating and developing applications in, differential privacy.

adversarial-robustness-toolbox: A Python library for Machine Learning Security.


Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide on installing packages.

Source Distribution

ai-privacy-toolkit-0.0.1.tar.gz (5.3 kB, uploaded as Source)

Built Distribution

ai_privacy_toolkit-0.0.1-py3-none-any.whl (5.8 kB, uploaded for Python 3)
