
Tools for the statistical disclosure control of machine learning models

Project description


AI-SDC

A collection of tools and resources for managing the statistical disclosure control of trained machine learning models. For a brief introduction, see Smith et al. (2022).

Content

  • aisdc
    • attacks Contains a variety of privacy attacks on machine learning models, including membership and attribute inference.
    • preprocessing Contains preprocessing modules for test datasets.
    • safemodel An open-source wrapper for common machine learning models, designed for use by researchers in Trusted Research Environments (TREs) where disclosure control methods must be implemented. The aim is to give researchers greater confidence that their models comply with disclosure control requirements (a usage sketch follows this list).
  • docs Contains Sphinx documentation files.
  • example_notebooks Contains short tutorials on the basic concept of "safe_XX" versions of machine learning algorithms, and examples of some specific algorithms.
  • examples Contains examples of how to run the code contained in this repository:
    • How to simulate attribute inference attacks attribute_inference_example.py.
    • How to simulate membership inference attacks:
      • Worst case scenario attack worst_case_attack_example.py.
      • LiRA (likelihood ratio) attack lira_attack_example.py.
    • Integration of attacks into safemodel classes safemodel_attack_integration_bothcalls.py.
  • risk_examples Contains hypothetical examples of data leakage through machine learning models as described in the Green Paper.
  • tests Contains unit tests.
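
As an illustration of the intended safemodel workflow, the sketch below wraps a classifier, fits it, and runs the built-in disclosure check before requesting release. The import path, class name (SafeDecisionTreeClassifier), and the preliminary_check method are assumptions based on the description above and may differ between releases; the scripts in examples and the notebooks in example_notebooks show the maintained API.

    # Illustrative sketch only: class and method names are assumptions and may
    # differ between releases; see the examples directory for the maintained API.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split

    from aisdc.safemodel.classifiers import SafeDecisionTreeClassifier  # assumed path

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

    # The safe wrapper keeps the usual scikit-learn interface but constrains
    # hyperparameters associated with higher disclosure risk (e.g. very small leaf sizes).
    model = SafeDecisionTreeClassifier(min_samples_leaf=10)
    model.fit(X_train, y_train)

    # Ask the wrapper whether the fitted model appears to satisfy the disclosure
    # rules before requesting release from the TRE.
    msg, disclosive = model.preliminary_check()
    print(msg, "disclosive:", disclosive)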

Documentation

Documentation is hosted here: https://ai-sdc.github.io/AI-SDC/


This work was funded by UK Research and Innovation Grant Number MC_PC_21033 as part of Phase 1 of the DARE UK (Data and Analytics Research Environments UK) programme (https://dareuk.org.uk/), delivered in partnership with HDR UK and ADR UK. The specific project was Guidelines and Resources for AI Model Access from TrusTEd Research environments (GRAIMATTER). This project has also been supported by MRC and EPSRC [grant number MR/S010351/1]: PICTURES.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
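
The package is published on PyPI as aisdc, so in most environments it can be installed directly with pip install aisdc (assuming a Python 3 interpreter with pip available); the files listed below are the distributions that command would fetch.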

Source Distribution

aisdc-1.0.1.post2.tar.gz (57.2 kB, source)

Built Distribution

aisdc-1.0.1.post2-py3-none-any.whl (66.1 kB, Python 3)

File details

Details for the file aisdc-1.0.1.post2.tar.gz.

File metadata

  • Download URL: aisdc-1.0.1.post2.tar.gz
  • Upload date:
  • Size: 57.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.6

File hashes

Hashes for aisdc-1.0.1.post2.tar.gz
  • SHA256: 1e47880030e7ef208be2d22b983be35a31e15c3cfc6a9e5093c11dfc43c31f7e
  • MD5: 08f7d2d906a2dc425830e71f756792e5
  • BLAKE2b-256: 8ce612819aac46d3613c91fcdd99ff6b4113dc9fbc1ee77a983007aa6172f719

See more details on using hashes here.

File details

Details for the file aisdc-1.0.1.post2-py3-none-any.whl.

File metadata

  • Download URL: aisdc-1.0.1.post2-py3-none-any.whl
  • Upload date:
  • Size: 66.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.6

File hashes

Hashes for aisdc-1.0.1.post2-py3-none-any.whl
  • SHA256: 3586e2e6c4edb44bb72a182dac8cd0dff15d81bc43087fc3a6bfb83a9e2470aa
  • MD5: 5794693859ac767f8b3b83feab9e9ff5
  • BLAKE2b-256: 6f8348592ac6a9d08ad3076396c8974e3a3f2036c1ac152dec161efcd3edd627

See more details on using hashes here.
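
To verify a download against the digests above, the SHA256 hash can be recomputed locally. Below is a minimal sketch using Python's standard hashlib module; the filename and expected digest are taken from the source distribution listed above, and the same approach applies to the wheel.

    import hashlib

    # Expected SHA256 for aisdc-1.0.1.post2.tar.gz, copied from the hashes listed above.
    EXPECTED = "1e47880030e7ef208be2d22b983be35a31e15c3cfc6a9e5093c11dfc43c31f7e"

    with open("aisdc-1.0.1.post2.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    print("OK" if digest == EXPECTED else "Hash mismatch: " + digest)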
