
DP-MLM: Differentially Private Text Rewriting Using Masked Language Models


DP-MLM


This is the code and package repository for the ACL 2024 Findings paper: DP-MLM: Differentially Private Text Rewriting Using Masked Language Models

Setup

Installation

You can install the package directly using:

pip install dpmlm

Alternatively, you can install from source. The repository includes a requirements.txt file containing all necessary Python dependencies.

Resource Bootstrapping

Before running the mechanism, you need to download the necessary NLTK resources:

from dpmlm.utils import setup_resources

setup_resources()

Usage of DP-MLM

The core logic resides in the DPMLM class. You can now initialize it with custom calibration bounds to ensure the DP privatization is tuned to your specific model (and bounding strategy).

from dpmlm import DPMLM
from dpmlm.utils import calculate_logit_bounds, cleanup

# 1. (Optional) Calibrate bounds for your specific model (e.g., RoBERTa)
bounds = calculate_logit_bounds("FacebookAI/roberta-base")

# 2. Instantiate the mechanism
M = DPMLM(MODEL="FacebookAI/roberta-base", calibration=bounds, bound_strategy=None)

# 3. Rewrite text
private_text = M.dpmlm_rewrite("Hello world, this is a private text.", epsilon=25)

# 4. Cleanup (optional, when finished)
cleanup(model_instances=[M])

If you want to set a bounding strategy for the clip bounds (beyond simple min/max selection), you can do so by passing a lambda function:

# strategy as used in the paper
strategy = lambda mean, std, low, high: (mean, mean + 4*std)
M = DPMLM(MODEL="FacebookAI/roberta-base", calibration=bounds, bound_strategy=strategy)
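To illustrate what such a strategy does, the sketch below (hypothetical helper names, not the package's actual internals) shows how a bound strategy maps logit statistics to clip bounds before noise is added:

```python
# Illustrative sketch only: how a bound_strategy could map logit
# statistics (mean, std, min, max) to clip bounds. This helper is
# hypothetical and not part of the dpmlm package.
def apply_bound_strategy(logits, strategy):
    mean = sum(logits) / len(logits)
    std = (sum((x - mean) ** 2 for x in logits) / len(logits)) ** 0.5
    low, high = min(logits), max(logits)
    lo, hi = strategy(mean, std, low, high)
    # Clip each logit into [lo, hi] so the value range is bounded
    return [min(max(x, lo), hi) for x in logits]

# The strategy from the paper: clip to [mean, mean + 4*std]
strategy = lambda mean, std, low, high: (mean, mean + 4 * std)
clipped = apply_bound_strategy([0.0, 1.0, 2.0, 10.0], strategy)
```

Narrowing the clip range this way reduces the noise needed for a given epsilon, at the cost of flattening logits that fall outside the bounds.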

DP-MLM Batched Mode

For longer documents, the batched mode provides significant performance increases by parallelizing masked token predictions on the GPU.

To use batching, simply run:

M.dpmlm_rewrite_batch("Large document text...", epsilon=25, batch_size=16)

Depending on your setup, you may need to tune the batch_size parameter for optimal performance.
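Conceptually, batching groups the masked-token positions into fixed-size chunks that are predicted together on the GPU. A minimal sketch (the helper name is illustrative, not the package's API):

```python
# Illustrative only: group masked-token positions into fixed-size
# batches so their predictions can run in parallel.
def make_batches(positions, batch_size=16):
    return [positions[i:i + batch_size]
            for i in range(0, len(positions), batch_size)]

# 40 masked positions with batch_size=16 yield batches of 16, 16, and 8
batches = make_batches(list(range(40)), batch_size=16)
```

Larger batches amortize more of the per-forward-pass overhead but require more GPU memory, which is why the best batch_size depends on your hardware.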

Input Document Length

As of the newest 2025 release, DP-MLM no longer suffers from the 512-token context window limit (256 with concatenation), which stemmed from the limitations of MLM context windows.

Now, DP-MLM operates with a sliding window: the maximum available context is used, centered on the target word to be privatized. DP-MLM therefore works on arbitrarily long documents!
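The window logic can be sketched as follows; the helper and the 512-token budget are illustrative assumptions, not the package's actual code:

```python
# Illustrative sketch: choose a context window of at most max_context
# tokens, centered on the target token and clamped to the document.
def context_window(num_tokens, target_idx, max_context=512):
    half = max_context // 2
    start = max(0, target_idx - half)
    end = min(num_tokens, start + max_context)
    start = max(0, end - max_context)  # re-anchor near the document end
    return start, end

# Even at position 9990 of a 10,000-token document, the target
# still receives a full 512-token context window.
start, end = context_window(10_000, 9_990)
```

Because each target word gets its own window, document length no longer constrains the mechanism; only the per-prediction context remains bounded by the MLM's limit.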

Usage of other evaluated models

One additional file is included for replicating the paper; it is easily importable and reusable:

  • LLMDP.py: implementations of both DP-Paraphrase and DP-Prompt. Note that for DP-Prompt, you will need to download the corresponding LMs, i.e., from Hugging Face.

from dpmlm import LLMDP

M = LLMDP.DPPrompt()
private_text = M.privatize("hello world", epsilon=100)

Important note

In order to use LLMDP.DPParaphrase, you must download the fine-tuned model directory. This can be found at the following link: Model

Citation

Please consider citing the original work that introduced DP-MLM. Thank you!

@inproceedings{meisenbacher-etal-2024-dp,
    title = "{DP}-{MLM}: Differentially Private Text Rewriting Using Masked Language Models",
    author = "Meisenbacher, Stephen  and
      Chevli, Maulik  and
      Vladika, Juraj  and
      Matthes, Florian",
    editor = "Ku, Lun-Wei  and
      Martins, Andre  and
      Srikumar, Vivek",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2024",
    month = aug,
    year = "2024",
    address = "Bangkok, Thailand",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.findings-acl.554/",
    doi = "10.18653/v1/2024.findings-acl.554",
    pages = "9314--9328"
}

