
A BPE modification that removes intermediate tokens during tokenizer training.


Picky BPE

This repository contains prototype code for the paper "BPE Gets Picky: Efficient Vocabulary Refinement During Tokenizer Training", presented at EMNLP 2024.

[ACL Anthology] [arXiv] [BibTeX]

Training

For training, use the bpe_trainer.py script. For example, the following command trains a Picky BPE tokenizer with a vocabulary size of 8192 and an IoS (Intersection over Self) threshold of 0.9.

$ python bpe_trainer.py --input_file train.txt --model_file model.json --vocab_size 8192 --threshold 0.9

The complete list of options is:

Args:
  --input_file     Path to the training corpus
  --model_file     Path to save the model
  --vocab_size     Desired vocabulary size
  --threshold      Desired IoS threshold
  --coverage       Relative symbol coverage for the initial vocabulary (default: 0.9999)
  --pad_id         PAD token id (default: 0)
  --unk_id         UNK token id (default: 1)
  --bos_id         BOS token id (default: 2)
  --eos_id         EOS token id (default: 3)
  --logging_step   How often (in merges) to log training progress (default: 200)
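
Conceptually, the IoS threshold controls when a token is removed: whenever a new merge consumes (nearly) all occurrences of one of its child tokens, that child is treated as an intermediate "junk" token and dropped from the vocabulary. A minimal sketch of the criterion, under the assumption that IoS is the ratio of the merged pair's frequency to the child token's frequency (the function names here are illustrative, not part of this package's API):

```python
def ios(freq_pair: int, freq_token: int) -> float:
    """Intersection over Self: the fraction of a child token's
    occurrences that are consumed by the newly merged pair."""
    return freq_pair / freq_token

def should_remove(freq_pair: int, freq_token: int, threshold: float = 0.9) -> bool:
    # If (almost) every occurrence of the child token appears inside
    # the merged pair, the token is an intermediate artifact and can
    # be removed from the vocabulary without hurting coverage.
    return ios(freq_pair, freq_token) >= threshold
```

A higher threshold (closer to 1.0) removes fewer tokens and behaves more like vanilla BPE; a lower threshold prunes more aggressively.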

Tokenization

To apply the trained Picky BPE model, use the picky_tokenize.py script. For example:

$ python picky_tokenize.py --model_file model.json --input_file train.txt --output_file train.tok.txt

The complete list of options is:

Args:
  --model_file    Path to the trained model
  --input_file    Path to the raw corpus
  --output_file   Path to save the tokenized corpus
  --return_type   Whether to output tokens ("str") or ids ("int") (default: "str")
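
At tokenization time, BPE-style models apply the learned merges to each word greedily, in training-priority order; Picky BPE additionally re-splits tokens that were removed during training. A rough sketch of the greedy merge loop only (illustrative: `apply_bpe` and the merge-table layout are assumptions, not this package's actual API, and the re-splitting of removed tokens is not shown):

```python
def apply_bpe(word: list[str], merges: dict[tuple[str, str], int]) -> list[str]:
    """Greedily apply BPE merges; lower rank = higher priority
    (the merge was learned earlier during training)."""
    while True:
        # Find the adjacent symbol pair with the best (lowest) merge rank.
        best = None
        for i in range(len(word) - 1):
            rank = merges.get((word[i], word[i + 1]))
            if rank is not None and (best is None or rank < best[0]):
                best = (rank, i)
        if best is None:
            return word  # no applicable merges remain
        _, i = best
        word = word[:i] + [word[i] + word[i + 1]] + word[i + 2:]
```

For example, with merges {("l", "o"): 0, ("lo", "w"): 1}, the word "low" tokenizes to a single token, while a word containing none of the learned pairs stays split into characters.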

Referencing

To cite PickyBPE:

@inproceedings{chizhov-etal-2024-bpe,
    title = "{BPE} Gets Picky: Efficient Vocabulary Refinement During Tokenizer Training",
    author = "Chizhov, Pavel  and
      Arnett, Catherine  and
      Korotkova, Elizaveta  and
      Yamshchikov, Ivan P.",
    editor = "Al-Onaizan, Yaser  and
      Bansal, Mohit  and
      Chen, Yun-Nung",
    booktitle = "Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2024",
    address = "Miami, Florida, USA",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.emnlp-main.925",
    pages = "16587--16604",
    abstract = "Language models can greatly benefit from efficient tokenization. However, they still mostly utilize the classical Byte-Pair Encoding (BPE) algorithm, a simple and reliable method. BPE has been shown to cause such issues as under-trained tokens and sub-optimal compression that may affect the downstream performance. We introduce PickyBPE, a modified BPE algorithm that carries out vocabulary refinement during tokenizer training by removing merges that leave intermediate {``}junk{''} tokens. Our method improves vocabulary efficiency, eliminates under-trained tokens, and does not compromise text compression. Our experiments show that this method either improves downstream performance or does not harm it.",
}

