A library for creating deanonymizers (attacks on privacy-preserving machine learning models) for the AnoMed competition platform.
Project description
Deanonymizer
A library for creating attacks against anonymizers (privacy-preserving machine learning models) on the AnoMed competition platform. Currently, only membership inference attacks are supported.
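For context, a membership inference attack (MIA) tries to decide whether a given record was part of the target model's training data. The following toy, threshold-on-confidence sketch is not this library's implementation, just an illustration of the idea: models tend to be more confident on data they were trained on.

```python
import numpy as np


def threshold_mia(confidences: np.ndarray, threshold: float = 0.9) -> np.ndarray:
    # Flag samples on which the target model is very confident as likely
    # training-set members. The threshold value is an arbitrary choice here.
    return confidences >= threshold


# Confidence of the target model in its predicted class for four samples:
memberships = threshold_mia(np.array([0.99, 0.55, 0.93, 0.40]))
```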
Usage Example
The following example creates a Falcon-based web app that encapsulates a deanonymizer targeting the example anonymizer defined in the anomed-anonymizer README.md (a privacy-preserving classifier for the famous Iris dataset classification problem). The encapsulated deanonymizer is a black-box membership inference attack, implemented using the Adversarial Robustness Toolbox (ART).
The web app offers these routes (some may have query parameters not mentioned here):
[GET] / (This displays an "alive message".)
[POST] /fit (This invokes fitting the membership inference attack; the web app will pull member and non-member data from member_url and nonmember_url.)
[POST] /evaluate (This invokes an intermediate or final evaluation of the attack.)
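For illustration, these routes could be exercised from a client roughly like this. This is only a sketch: the base URL `http://localhost:8000` is an assumption, and any query parameters the routes accept are omitted.

```python
from urllib import request

BASE_URL = "http://localhost:8000"  # assumed address; adjust to your deployment


def route(path: str, base: str = BASE_URL) -> str:
    # Join the base URL and a route path without doubling slashes.
    return base.rstrip("/") + path


def check_alive() -> bytes:
    # [GET] / -- fetch the "alive message"
    with request.urlopen(route("/")) as resp:
        return resp.read()


def trigger_fit() -> int:
    # [POST] /fit -- ask the server to fit the attack
    req = request.Request(route("/fit"), data=b"", method="POST")
    with request.urlopen(req) as resp:
        return resp.status


def trigger_evaluate() -> int:
    # [POST] /evaluate -- ask the server to evaluate the fitted attack
    req = request.Request(route("/evaluate"), data=b"", method="POST")
    with request.urlopen(req) as resp:
        return resp.status
```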
```python
import anomed_deanonymizer
import numpy as np
from art.attacks.inference.membership_inference import MembershipInferenceBlackBox


def validate_input_array(feature_array: np.ndarray) -> None:
    # Check the number of dimensions first; otherwise accessing shape[1]
    # would raise an IndexError for 1-D arrays.
    if len(feature_array.shape) != 2 or feature_array.shape[1] != 4:
        raise ValueError("Feature array needs to have shape (n_samples, 4).")
    # np.float64 instead of np.float_, which was removed in NumPy 2.0.
    if feature_array.dtype != np.float64:
        raise ValueError("Feature array must be an array of floats.")


attack_target = anomed_deanonymizer.WebClassifier(
    url="http://example.com/predict", input_shape=(4,), nb_classes=3
)

example_attack_art = MembershipInferenceBlackBox(estimator=attack_target)  # type: ignore

example_attack = anomed_deanonymizer.ARTWrapper(
    art_mia=example_attack_art, input_validator=validate_input_array
)

application = anomed_deanonymizer.supervised_learning_MIA_server_factory(
    anonymizer_identifier="example_anonymizer",
    deanonymizer_identifier="example_deanonymizer",
    deanonymizer_obj=example_attack,
    model_filepath="deanonymizer.pkl",
    default_batch_size=64,
    member_url="http://example.com/members",
    nonmember_url="http://example.com/non-members",
    evaluation_data_url="http://example.com/attack-success-evaluation",
    model_loader=anomed_deanonymizer.unpickle_deanonymizer,
    utility_evaluation_url="http://example.com/utility",
)
```
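The `application` object returned by the factory is a WSGI callable (a Falcon app), so it can be served by any WSGI server. A minimal sketch using Python's built-in reference server, suitable for local testing only (for real deployments, a production WSGI server such as gunicorn or waitress would be the usual choice):

```python
from wsgiref.simple_server import make_server


def serve(app, host: str = "127.0.0.1", port: int = 8000) -> None:
    # Serve any WSGI callable, e.g. the ``application`` built above,
    # until the process is interrupted.
    with make_server(host, port, app) as httpd:
        print(f"Serving on http://{host}:{port}")
        httpd.serve_forever()


# serve(application)  # uncomment once ``application`` has been created
```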
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file anomed_deanonymizer-0.0.7.tar.gz.
File metadata
- Download URL: anomed_deanonymizer-0.0.7.tar.gz
- Upload date:
- Size: 10.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.16
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2cfdce7147d8053dfaccea514eaff204fb6cbb0f50f9bfb8a10c74a7c5476ffe |
| MD5 | 96b57dc7831198a1342e5723351ccbe7 |
| BLAKE2b-256 | 945e057c27ef1c7cfcd3da5f1838a92b11ef66566b753fcd71fb8eb3149f6872 |
File details
Details for the file anomed_deanonymizer-0.0.7-py3-none-any.whl.
File metadata
- Download URL: anomed_deanonymizer-0.0.7-py3-none-any.whl
- Upload date:
- Size: 10.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.16
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | fa50059ca06330d3c0ddbb1fa6eedde36040015709487f46c51e6fb4c69c5910 |
| MD5 | 9f4b7c3818d68a97a45a652e60ed64d5 |
| BLAKE2b-256 | 00c66a5445d7d7d77d1f28c81453c741c3862b8ad1dd81e28385f382ea566d2b |