
hf-trim


A package to reduce the size of 🤗 Hugging Face models via vocabulary trimming.

The library currently supports the following models (and their pretrained versions available on the Hugging Face Models hub):

  1. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation
  2. mBART: Multilingual Denoising Pre-training for Neural Machine Translation
  3. T5: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
  4. mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer

"Why would I need to trim the vocabulary on a model?" 🤔

To put it simply, vocabulary trimming is a way to reduce a language model's memory footprint while retaining most of its performance.

Read more here.
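
To get a sense of the savings, note that a large share of a multilingual model's parameters sits in its embedding matrices, which scale linearly with vocabulary size. A back-of-the-envelope sketch (the trimmed vocabulary size of 5,000 is a made-up example; the mT5-small dimensions are taken from its published config):

# Rough illustration, not hf-trim output: an embedding matrix holds
# vocab_size * d_model parameters, so shrinking the vocabulary removes
# parameters roughly in proportion.
full_vocab, trimmed_vocab, d_model = 250_112, 5_000, 512  # mT5-small dims
removed = (full_vocab - trimmed_vocab) * d_model
print(f"Parameters removed per embedding matrix: {removed:,}")  # ~125.5M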

Citation

If you use this software, please cite it as given below:

@software{Srivastava_hf-trim,
  author = {Srivastava, Aditya},
  license = {MPL-2.0},
  title = {{hf-trim}},
  url = {https://github.com/IamAdiSri/hf-trim}
}

Installation

Run the following command to install from PyPI:

$ pip install hf-trim

Alternatively, you can install from source:

$ git clone https://github.com/IamAdiSri/hf-trim
$ cd hf-trim
$ pip install .

Usage

Simple Example

from transformers import MT5Config, MT5Tokenizer, MT5ForConditionalGeneration
from hftrim.TokenizerTrimmer import TokenizerTrimmer
from hftrim.ModelTrimmers import MT5Trimmer

data = [
    " UN Chief Says There Is No Military Solution in Syria",
    "Şeful ONU declară că nu există o soluţie militară în Siria"
]

# load pretrained config, tokenizer and model
config = MT5Config.from_pretrained("google/mt5-small")
tokenizer = MT5Tokenizer.from_pretrained("google/mt5-small")
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

# trim tokenizer
tt = TokenizerTrimmer(tokenizer)
tt.make_vocab(data)
tt.make_tokenizer()

# trim model
mt = MT5Trimmer(model, config, tt.trimmed_tokenizer)
mt.make_weights(tt.trimmed_vocab_ids)
mt.make_model()

You can directly use the trimmed model with mt.trimmed_model and the trimmed tokenizer with tt.trimmed_tokenizer.
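
For example, the trimmed pair is used for inference exactly like any other 🤗 Transformers model and tokenizer (a minimal sketch; the generation arguments are illustrative):

# Sketch: encode with the trimmed tokenizer and generate with the
# trimmed model, as with any regular model/tokenizer pair.
inputs = tt.trimmed_tokenizer(data[0], return_tensors="pt")
outputs = mt.trimmed_model.generate(**inputs, max_new_tokens=32)
print(tt.trimmed_tokenizer.decode(outputs[0], skip_special_tokens=True))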

Saving and Loading

# save with
tt.trimmed_tokenizer.save_pretrained('trimT5')
mt.trimmed_model.save_pretrained('trimT5')

# load with
config = MT5Config.from_pretrained("trimT5")
tokenizer = MT5Tokenizer.from_pretrained("trimT5")
model = MT5ForConditionalGeneration.from_pretrained("trimT5")
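
As a quick sanity check (a sketch, not part of hf-trim's API), you can compare parameter counts of the original and the reloaded trimmed model:

# The difference comes almost entirely from the shrunken embedding
# and output-projection matrices.
full = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")
print(sum(p.numel() for p in full.parameters()))   # original parameter count
print(sum(p.numel() for p in model.parameters()))  # trimmed parameter count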

Limitations

  • Fast tokenizers are currently unsupported.
  • TensorFlow and Flax models are currently unsupported.

Roadmap

  • Add support for MarianMT models.
  • Add support for FSMT models.

Issues

Feel free to open an issue if you run into bugs, have any queries or want to request support for an architecture.

Contributing

Contributions are welcome, especially those adding functionality for new or currently unsupported models.

Download files

Download the file for your platform.

Source Distribution

hf-trim-3.0.1.tar.gz (11.9 kB)


Built Distribution

hf_trim-3.0.1-py3-none-any.whl (14.4 kB)


File details

Details for the file hf-trim-3.0.1.tar.gz.

File metadata

  • Download URL: hf-trim-3.0.1.tar.gz
  • Upload date:
  • Size: 11.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.13

File hashes

Hashes for hf-trim-3.0.1.tar.gz

  • SHA256: ac9b1030322f2ad1ed38fbfbe6e73e8f61bcf13f1a1cef5f1b15a8884d3cb240
  • MD5: 5c59144cdf8db716a1e8274da80c7876
  • BLAKE2b-256: 82c8702177e6a766c032583386faf7bc862b0fbc48670f0be0f0425aa19d1ad8

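You can verify a downloaded archive against the published digests with Python's standard library (a sketch using hashlib; it assumes the archive sits in the current directory):

# Compare the local file's SHA256 against the digest published above.
import hashlib

with open("hf-trim-3.0.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print(digest == "ac9b1030322f2ad1ed38fbfbe6e73e8f61bcf13f1a1cef5f1b15a8884d3cb240")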

File details

Details for the file hf_trim-3.0.1-py3-none-any.whl.

File metadata

  • Download URL: hf_trim-3.0.1-py3-none-any.whl
  • Upload date:
  • Size: 14.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.13

File hashes

Hashes for hf_trim-3.0.1-py3-none-any.whl

  • SHA256: 418cdb94a75d575bdedd7498705c6b76131df97216dee6abc26aec61e1902a9b
  • MD5: 6368f2d1e975c36441fca60e520fcc9a
  • BLAKE2b-256: c0de669b527912bf87935f0eaa8c42a6d20b7a4b8c3d83cee31bc6862c8796a0

