
MLoRAx is a minimalist library for low-rank adaptation (LoRA), designed to effortlessly enable parameter-efficient training for Transformer-based models.

Project description

Installation

Choice 1: Install with pip

You can install mlorax with pip. This is the recommended way to use mlorax if you want to receive future updates.

pip install mlorax

Choice 2: Just copy the code

You can also directly copy the code from mlorax.py and paste it into your project. This is the easiest way to use mlorax if you do not care about future updates.

Choice 3: Install from source

You can also install mlorax from source. You only need to do this if you want to contribute to the project.

git clone https://github.com/yongchanghao/MLoRAx.git
cd MLoRAx
pip install -e .

Usage

Converting any Flax model to a LoRA model with mlorax takes only a few lines. The following snippet shows how to convert a T5 model to a LoRA model based on HuggingFace's FlaxT5ForConditionalGeneration class.

+ import mlorax
  model = FlaxT5ForConditionalGeneration.from_pretrained('t5-small')
- params = model.params
- apply_fn = model.__call__
+ lora_spec = mlorax.LoRASpec(rank=16, rules=['Attention.q', 'Attention.v'])
+ params, apply_fn, merge_fn = mlorax.lora_init(lora_spec, model)
  state = TrainState(apply_fn=apply_fn, params=params, tx=tx, **kwargs)

That's it! You can now train the model as usual.
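For intuition, LoRA freezes each targeted weight matrix W and learns a low-rank update BA on top of it, so only rank × (d_in + d_out) parameters are trained instead of d_in × d_out. Below is a minimal NumPy sketch of the idea; it is illustrative only, not MLoRAx's actual implementation:

```python
import numpy as np

rank = 16
d_in, d_out = 512, 512
rng = np.random.default_rng(0)

W = rng.normal(size=(d_in, d_out))          # frozen pretrained weight
A = rng.normal(size=(rank, d_out)) * 0.01   # trainable low-rank factor
B = np.zeros((d_in, rank))                  # trainable, initialized to zero

x = rng.normal(size=(d_in,))
# Forward pass with the adapter: only A and B would receive gradients.
y = x @ W + (x @ B) @ A
# Because B starts at zero, the adapted model initially matches the base model.
assert np.allclose(y, x @ W)
```

Note the parameter saving: with rank 16 and a 512×512 weight, the adapter trains 16 × (512 + 512) = 16,384 parameters instead of 262,144.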

Principles

Always use the returned apply_fn for the forward pass when possible. Otherwise, pass the merged parameters to the function with params=merge_fn(params). For example, if you want to use model.generate for text generation, you can do the following:

- outputs = model.generate(**batch, params=params)
+ outputs = model.generate(**batch, params=merge_fn(params))
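Conceptually, merging folds the low-rank factors back into the base weight, so any code path that expects ordinary dense parameters (such as generate) works unchanged. A toy NumPy sketch of that idea follows; the names W, B, and A are illustrative and do not reflect MLoRAx's actual parameter layout:

```python
import numpy as np

def merge(W, B, A):
    """Fold LoRA factors into the base weight so standard code paths
    see an ordinary dense matrix."""
    return W + B @ A

rng = np.random.default_rng(1)
W = rng.normal(size=(8, 8))   # frozen base weight
B = rng.normal(size=(8, 2))   # low-rank factors (rank 2)
A = rng.normal(size=(2, 8))
x = rng.normal(size=(8,))

W_merged = merge(W, B, A)
# The merged forward pass equals the adapter forward pass.
assert np.allclose(x @ W_merged, x @ W + (x @ B) @ A)
```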

Example and Results

Please refer to the examples folder for details.

Citation

If you find MLoRAx useful, please cite the following paper:

@software{hao2023mlorax,
  author = {Yongchang Hao},
  title = {{ML}o{RA}x: a minimalist library for low-rank adaptation for {T}ransformer-based models},
  year = {2023},
  url = {https://github.com/yongchanghao/MLoRAx},
  version = {0.9.2}
}



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mlorax-0.9.2.tar.gz (8.1 kB)

Uploaded Source

Built Distribution

mlorax-0.9.2-py3-none-any.whl (8.6 kB)

Uploaded Python 3

File details

Details for the file mlorax-0.9.2.tar.gz.

File metadata

  • Download URL: mlorax-0.9.2.tar.gz
  • Upload date:
  • Size: 8.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for mlorax-0.9.2.tar.gz
  • SHA256: 033afbe677f8009a015fb54a5affd8382ae82e40cb12b64b256a70757a40ca2c
  • MD5: 5831c910cefc9d2b21a66a31ff20a84f
  • BLAKE2b-256: a11bd18030429946d99fbf55c6f5902c3266668dbab12838c59cb6412b5c0aae

See more details on using hashes here.
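To check a downloaded archive against the digests above, you can compute its SHA-256 locally. A small Python sketch using only the standard library (the helper name sha256_hex is ours, not part of mlorax):

```python
import hashlib

def sha256_hex(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, streaming it in chunks
    so large archives do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the published digest, e.g.:
# sha256_hex("mlorax-0.9.2.tar.gz") == "033afbe677f8009a015fb54a5affd8382ae82e40cb12b64b256a70757a40ca2c"
```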

File details

Details for the file mlorax-0.9.2-py3-none-any.whl.

File metadata

  • Download URL: mlorax-0.9.2-py3-none-any.whl
  • Upload date:
  • Size: 8.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for mlorax-0.9.2-py3-none-any.whl
  • SHA256: ddf52d47ae33d228836a6c5e12c96ecc48de4d885173083baa0d569b18524442
  • MD5: 899d91a8eaa43c099465f505de4661fa
  • BLAKE2b-256: b276b0ae549c58958808814a4ebe4b681c0351767764a6baaee5f705040b1586

See more details on using hashes here.
