DP-MLM: Differentially Private Text Rewriting Using Masked Language Models
Project description
This is the code and package repository for the ACL 2024 Findings paper: DP-MLM: Differentially Private Text Rewriting Using Masked Language Models
Setup
Installation
You can install the package directly using:

```
pip install dpmlm
```
Optionally, you can install from source. In this repository, you will find a requirements.txt file, which contains all necessary Python dependencies.
Resource Bootstrapping
Before running the mechanism, you need to download the necessary NLTK resources:

```python
from dpmlm.utils import setup_resources
setup_resources()
```
Usage of DP-MLM
The core logic resides in the DPMLM class. You can now initialize it with custom calibration bounds to ensure the DP privatization is tuned to your specific model (and bounding strategy).
```python
from dpmlm import DPMLM
from dpmlm.utils import calculate_logit_bounds

# 1. (Optional) Calibrate bounds for your specific model (e.g., RoBERTa)
bounds = calculate_logit_bounds("FacebookAI/roberta-base")

# 2. Instantiate the mechanism
M = DPMLM(MODEL="FacebookAI/roberta-base", calibration=bounds, bound_strategy=None)

# 3. Rewrite text
private_text = M.dpmlm_rewrite("Hello world, this is a private text.", epsilon=25)
```
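Conceptually, DP-MLM replaces each token by sampling from the MLM's (clipped) logits with an exponential-mechanism-style selection. The following is a minimal, self-contained sketch of that idea, not the package's actual implementation; the clip bounds and the sensitivity-to-temperature mapping here are simplifying assumptions:

```python
import math
import random

def exponential_mechanism(logits, epsilon, low, high):
    """Pick a candidate index DP-style: clip scores to [low, high], then
    sample proportionally to exp(epsilon * score / (2 * sensitivity)),
    taking the clip range as the sensitivity (a simplifying assumption)."""
    sensitivity = high - low
    clipped = [min(max(s, low), high) for s in logits]
    weights = [math.exp(epsilon * s / (2 * sensitivity)) for s in clipped]
    r = random.random() * sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

# Higher epsilon concentrates probability mass on the highest-scoring token.
idx = exponential_mechanism([0.1, 5.0, -2.0], epsilon=50, low=-3.0, high=6.0)
```

This is why the clip bounds matter: a tighter clip range lowers the sensitivity, so the same epsilon yields a sharper, more utility-preserving distribution.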
If you want to set a bounding strategy for the clip bounds (beyond simple min/max selection), you can do so by passing a lambda function:
```python
# strategy as used in the paper
strategy = lambda mean, std, low, high: (mean, mean + 4*std)

M = DPMLM(MODEL="FacebookAI/roberta-base", calibration=bounds, bound_strategy=strategy)
```
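To make the contract concrete: a bound strategy is simply a callable mapping the calibration statistics (mean, standard deviation, minimum, and maximum of observed logits) to a `(lower, upper)` clip pair. A toy illustration with made-up statistics:

```python
# Hypothetical calibration statistics (mean, std, min, max of logits).
mean, std, low, high = 0.0, 2.5, -14.0, 21.0

# The paper's strategy: clip logits to [mean, mean + 4*std].
strategy = lambda mean, std, low, high: (mean, mean + 4 * std)

clip_low, clip_high = strategy(mean, std, low, high)
# -> (0.0, 10.0)
```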
DP-MLM Batched Mode
For longer documents, the batched mode provides significant performance increases by parallelizing masked token predictions on the GPU.
To use batching, simply run:

```python
M.dpmlm_rewrite_batch("Large document text...", epsilon=25, batch_size=16)
```

Depending on your setup, you may need to tune the batch_size parameter for optimal performance.
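The speedup comes from grouping the per-token masked predictions into fixed-size batches for the GPU rather than running one forward pass per mask. A rough sketch of the grouping logic (hypothetical, not the package's internals):

```python
def chunk(mask_positions, batch_size):
    """Split the list of mask positions into fixed-size batches,
    each of which would be run through the MLM in one forward pass."""
    return [mask_positions[i:i + batch_size]
            for i in range(0, len(mask_positions), batch_size)]

# 40 tokens to privatize with batch_size=16 -> batches of 16, 16, and 8.
batches = chunk(list(range(40)), batch_size=16)
# -> batch lengths [16, 16, 8]
```

Larger batches amortize per-call overhead until GPU memory becomes the bottleneck, which is why the best `batch_size` depends on your hardware.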
Input Document Length
As of the 2025 release, DP-MLM is no longer constrained by the 512-token context window of the underlying MLM (effectively 256 tokens with concatenation).
DP-MLM now operates with a sliding window: the maximum available context is used, centered around the target word to be privatized. As a result, DP-MLM works on arbitrarily long documents!
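The sliding window can be pictured as follows: for each target position, take up to the model's maximum number of surrounding tokens, centered on the target where possible and shifted at document boundaries. This is a simplified sketch; the package's exact windowing may differ:

```python
def context_window(tokens, target_idx, max_len=512):
    """Return a slice of `tokens` of at most `max_len`, centered on
    `target_idx` where possible, plus the target's index in that slice."""
    half = max_len // 2
    start = max(0, min(target_idx - half, len(tokens) - max_len))
    end = min(len(tokens), start + max_len)
    return tokens[start:end], target_idx - start

tokens = list(range(2000))  # an arbitrarily long "document"
window, local_idx = context_window(tokens, target_idx=1500)
# -> len(window) == 512 and window[local_idx] == 1500
```

Because each window is computed independently per target token, document length no longer matters: only the per-token context is bounded.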
Usage of other evaluated models
There is one other included file for replication of the paper, which is easily importable and reusable:

LLMDP.py: implementations of both DP-Paraphrase and DP-Prompt. Note that for DP-Prompt, you will need to download the corresponding LMs, e.g., from Hugging Face.
```python
M = LLMDP.DPPrompt()
M.privatize("hello world", epsilon=100)
```
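DP-Prompt's privacy mechanism is, at its core, temperature sampling: with generation logits clipped to a range [b1, b2], sampling at temperature T = 2(b2 - b1)/epsilon gives a per-token exponential-mechanism guarantee. The helper below is a hypothetical illustration of that relationship, not an API of LLMDP:

```python
def dp_temperature(epsilon, clip_low, clip_high):
    """Sampling temperature for clipped-logit generation that yields a
    per-token epsilon-DP guarantee (sketch; constants per the standard
    exponential-mechanism analysis)."""
    return 2 * (clip_high - clip_low) / epsilon

# epsilon=100 with logits clipped to [-10, 10] -> temperature 0.4
t = dp_temperature(100, -10.0, 10.0)
```

Note the trade-off this exposes: smaller epsilon means a higher temperature, i.e., noisier, less faithful paraphrases.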
Important note
In order to use LLMDP.DPParaphrase, you must download the fine-tuned model directory.
This can be found at the following link: Model
Citation
Please consider citing the original work that introduced DP-MLM. Thank you!
```bibtex
@inproceedings{meisenbacher-etal-2024-dp,
    title = "{DP}-{MLM}: Differentially Private Text Rewriting Using Masked Language Models",
    author = "Meisenbacher, Stephen and
      Chevli, Maulik and
      Vladika, Juraj and
      Matthes, Florian",
    editor = "Ku, Lun-Wei and
      Martins, Andre and
      Srikumar, Vivek",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2024",
    month = aug,
    year = "2024",
    address = "Bangkok, Thailand",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.findings-acl.554/",
    doi = "10.18653/v1/2024.findings-acl.554",
    pages = "9314--9328"
}
```