Tools for the statistical disclosure control of machine learning models
Project description
AI-SDC
A collection of tools and resources for managing the statistical disclosure control of trained machine learning models. For a brief introduction, see Smith et al. (2022).
Content
aisdc
attacks
Contains a variety of privacy attacks on machine learning models, including membership and attribute inference.
preprocessing
Contains preprocessing modules for test datasets.
safemodel
The safemodel package is an open-source wrapper for common machine learning models, designed for use by researchers in Trusted Research Environments (TREs) where disclosure control methods must be implemented. Safemodel aims to give researchers greater confidence that their models comply with disclosure control requirements.
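The wrapper idea can be illustrated with a minimal, self-contained sketch. The class name, rule table, and method below are hypothetical stand-ins, not the actual safemodel API: before requesting release, the wrapper compares a model's hyperparameters against disclosure-control rules and reports any that look unsafe.

```python
# Conceptual sketch of the safemodel idea (hypothetical names, not the
# actual aisdc API): wrap a model and check its hyperparameters against
# disclosure-control rules before a researcher requests release.

DISCLOSURE_RULES = {
    # e.g. a tree leaf covering fewer than 5 records risks identifying them
    "min_samples_leaf": ("min", 5),
}

class SafeModelSketch:
    def __init__(self, **hyperparams):
        self.hyperparams = hyperparams

    def preliminary_check(self):
        """Return (ok, messages) comparing hyperparameters to the rules."""
        messages = []
        for name, (kind, bound) in DISCLOSURE_RULES.items():
            value = self.hyperparams.get(name)
            if value is None:
                continue
            if kind == "min" and value < bound:
                messages.append(f"{name}={value} is below the safe minimum {bound}")
        return (not messages, messages)

ok, msgs = SafeModelSketch(min_samples_leaf=1).preliminary_check()
print(ok, msgs)  # False, with an explanatory message
```

In the real package the checks are richer and the rules are model-specific, but the design choice is the same: encode TRE guidance as data that a wrapper can evaluate automatically, rather than relying on every researcher remembering it.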
docs
Contains Sphinx documentation files.
example_notebooks
Contains short tutorials on the basic concept of "safe_XX" versions of machine learning algorithms, and examples of some specific algorithms.
examples
Contains examples of how to run the code contained in this repository:
- How to simulate attribute inference attacks: attribute_inference_example.py
- How to simulate membership inference attacks:
  - Worst case scenario attack: worst_case_attack_example.py
  - LIRA scenario attack: lira_attack_example.py
- Integration of attacks into safemodel classes: safemodel_attack_integration_bothcalls.py
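As background to the membership inference examples above, the core idea behind a worst-case, loss-threshold attack can be sketched in a few self-contained lines. The function names are illustrative only, not the aisdc API: an attacker guesses that a record was in the training set when the model's loss on it is unusually low, exploiting the tendency of overfitted models to be more confident on training data.

```python
import math

# Conceptual sketch (not the aisdc API): a loss-threshold membership
# inference attack against a probabilistic classifier.

def cross_entropy(p_true_class: float) -> float:
    """Loss of a model that assigns probability p_true_class to the true label."""
    return -math.log(max(p_true_class, 1e-12))

def membership_guess(p_true_class: float, threshold: float = 0.5) -> bool:
    """Guess 'member of training set' when the per-record loss is below threshold."""
    return cross_entropy(p_true_class) < threshold

# An overfitted model is very confident on a training record...
print(membership_guess(0.99))  # low loss  -> guessed member: True
# ...and less confident on an unseen record.
print(membership_guess(0.55))  # higher loss -> guessed non-member: False
```

The worst-case and LIRA examples in this repository build on the same intuition with stronger statistical machinery (e.g. calibrating the threshold per record), but the signal being exploited is this confidence gap.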
risk_examples
Contains hypothetical examples of data leakage through machine learning models as described in the Green Paper.
tests
Contains unit tests.
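The attribute inference risk covered by the attacks module can also be sketched conceptually. The toy model and names below are invented for illustration and are not from this repository: knowing a target record's other features, an attacker tries each candidate value of a sensitive attribute and keeps the one the model is most confident about, on the assumption that the model memorised the training record.

```python
# Conceptual sketch (not the aisdc API): attribute inference against a
# model that has memorised a training record with age=42, smoker=True.

def toy_model_confidence(age: int, smoker: bool) -> float:
    """Stand-in for a trained model's confidence in its predicted class."""
    return 0.95 if (age, smoker) == (42, True) else 0.6

def infer_attribute(known_age: int, candidates=(False, True)) -> bool:
    """Pick the candidate 'smoker' value that maximises model confidence."""
    return max(candidates, key=lambda s: toy_model_confidence(known_age, s))

print(infer_attribute(42))  # True: the memorised sensitive value is recovered
```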
Documentation
Documentation is hosted here: https://ai-sdc.github.io/AI-SDC/
Quick Start
Development
Clone the repository and install the dependencies (safest in a virtual env):
$ git clone https://github.com/AI-SDC/AI-SDC.git
$ cd AI-SDC
$ pip install -r requirements.txt
Then run the tests:
$ pip install pytest
$ pytest .
Or run an example:
$ python -m examples.lira_attack_example
Installation / End-user
Install aisdc (safest in a virtual env) and manually copy the examples and example_notebooks directories:
$ pip install aisdc
Then to run an example:
$ python attribute_inference_example.py
Or start up jupyter notebook and run an example.
Alternatively, you can clone the repo and install:
$ git clone https://github.com/AI-SDC/AI-SDC.git
$ cd AI-SDC
$ pip install .
This work was funded by UK Research and Innovation Grant Number MC_PC_21033 as part of Phase 1 of the DARE UK (Data and Analytics Research Environments UK) programme (https://dareuk.org.uk/), delivered in partnership with HDR UK and ADRUK. The specific project was Guidelines and Resources for AI Model Access from TrusTEd Research environments (GRAIMATTER). This project has also been supported by MRC and EPSRC [grant number MR/S010351/1]: PICTURES.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file aisdc-1.0.4.tar.gz.
File metadata
- Download URL: aisdc-1.0.4.tar.gz
- Upload date:
- Size: 84.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4e4cade211e85e5f35bc38f9cffcab1870c496034bf2c7badeaeda7a4d54fad1
MD5 | a51922b7df9357c9b57a019f78c2ffdc
BLAKE2b-256 | e1fdeaa2fa2f0e8d1c77df5c13183e5e2660fd6854bf2554bbc28b272711f195
File details
Details for the file aisdc-1.0.4-py3-none-any.whl.
File metadata
- Download URL: aisdc-1.0.4-py3-none-any.whl
- Upload date:
- Size: 70.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 33a088e110980c6f985141b31b6426b0ce619194c498252f24359b335d2b1c8c
MD5 | 41c3e5f3f2d5928d2ecc052bbd2dae82
BLAKE2b-256 | 3a51e932af59b5fdb64f366a6d5804c21d69afe3f46954415df0683d03268e7c