FMS Acceleration Plugin for Attention and Distributed Packing Optimizations
This library contains plugins to accelerate finetuning with the following optimizations:
- Padding-Free Flash Attention Computation
- Multipack Distributed Sampling
Plugins
| Plugin | Description | Depends | Loading | Augmentation | Callbacks |
|---|---|---|---|---|---|
| padding_free | Padding-Free Flash Attention Computation | flash_attn | | ✅ | |
| multipack sampler | Multipack Distributed Sampling | numba | | ✅ | |
Native Transformers Support from v4.44.0
Transformers natively supports padding-free computation from v4.44.0. The padding-free plugin will use the transformers implementation if compatible;
otherwise, for transformers < v4.44.0, the plugin falls back to an internal implementation.
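The version gating described above can be sketched as a simple comparison of the installed library version against the minimum that ships the native feature. This is an illustrative stand-in, not the plugin's actual code; the function names are hypothetical.

```python
def parse_version(v: str) -> tuple:
    """Parse a dotted version string like "4.44.0" into a comparable tuple.
    Pre-release suffixes beyond the third component are ignored in this sketch."""
    return tuple(int(part) for part in v.split(".")[:3])

def use_native_impl(installed: str, minimum: str) -> bool:
    """Return True when the installed version meets the minimum, i.e. the
    native implementation can be used instead of the internal fallback."""
    return parse_version(installed) >= parse_version(minimum)

# transformers >= v4.44.0 -> native padding-free path; older -> fallback.
print(use_native_impl("4.44.0", "4.44.0"))  # True
print(use_native_impl("4.43.3", "4.44.0"))  # False
```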
Native TRL Support for PaddingFree with DataCollatorForCompletionOnlyLM from v0.10.1
Users can use PaddingFree with untokenized data from TRL >= v0.10.1. The flattening of inputs and the addition of position_ids to the batch
are carried out inside DataCollatorForCompletionOnlyLM when the keyword padding_free is passed to the collator. The plugin uses the TRL implementation if compatible;
otherwise, for trl < v0.10.1, it falls back to an internal implementation.
If a user passes in a pretokenized dataset, the plugin will instead use DataCollatorWithFlattening as the collate_fn.
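The two transformations mentioned above, flattening the batch and emitting position_ids that restart at each sequence boundary, can be sketched in plain Python. This is a minimal illustration of the idea, not the TRL or plugin implementation:

```python
def flatten_batch(batch):
    """Flatten a list of token-id sequences into one concatenated row and
    build position_ids that restart at 0 at every sequence boundary, so the
    attention kernel can recover the original sequence boundaries without
    any padding tokens (sketch only)."""
    input_ids, position_ids = [], []
    for seq in batch:
        input_ids.extend(seq)
        position_ids.extend(range(len(seq)))
    # Batch dimension of 1: the whole micro-batch becomes a single row.
    return {"input_ids": [input_ids], "position_ids": [position_ids]}

out = flatten_batch([[5, 6, 7], [8, 9]])
print(out["input_ids"])     # [[5, 6, 7, 8, 9]]
print(out["position_ids"])  # [[0, 1, 2, 0, 1]]
```

Because position_ids reset to 0 at each boundary, a boundary-aware flash attention kernel can mask attention across sequences even though they share one row.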
Running Benchmarks
To reproduce the benchmarks, run the following commands:
Reproduce Padding-Free on an A100 80GB:

```shell
tox -e run-benches -- "1 2" "4 8" benchmark_outputs scenarios-orca.yaml "none"
```

Reproduce Multipack on an A100 80GB:

```shell
tox -e run-benches -- "2 4 8" "16 32 64" benchmark_outputs scenarios-orca.yaml "padding-free"
```
Known Issues
Currently Only Supports Multipack with Padding-Free
The multipack plugin currently requires the padding-free plugin to work. This may change in the future if there is demand for multipack to work standalone, without padding-free.
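The core idea behind multipack sampling, distributing variable-length sequences into fixed token budgets so every device gets a balanced load, can be illustrated with a greedy first-fit-decreasing packer. This is an educational stand-in, not the plugin's actual (numba-accelerated) algorithm:

```python
def multipack(lengths, max_tokens):
    """Pack sequence indices into bins whose total token count stays under
    max_tokens, placing longer sequences first (first-fit-decreasing).
    Returns a list of bins, each a list of indices into `lengths`."""
    order = sorted(range(len(lengths)), key=lambda i: lengths[i], reverse=True)
    bins, loads = [], []
    for i in order:
        # Place into the first bin with enough remaining token budget.
        for b in range(len(bins)):
            if loads[b] + lengths[i] <= max_tokens:
                bins[b].append(i)
                loads[b] += lengths[i]
                break
        else:
            # No existing bin fits; open a new one.
            bins.append([i])
            loads.append(lengths[i])
    return bins

packs = multipack([900, 500, 400, 300, 100], max_tokens=1000)
print(packs)  # [[0, 4], [1, 2], [3]]
```

Each bin would then be handed to one rank as a padding-free flattened batch, which is why the two plugins are used together.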
File details
Details for the file fms_acceleration_aadp-0.2.0-py3-none-any.whl.
File metadata
- Download URL: fms_acceleration_aadp-0.2.0-py3-none-any.whl
- Size: 16.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.0.1 CPython/3.12.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 96285ca6a168cac90d70aea7b83677709aadd9fc165aeec753c05bab3ab295fa |
| MD5 | 83fed34c27a74e0a7cb07066d509cfa7 |
| BLAKE2b-256 | c7286f09277482e571193a2f1462e83648572ca01fa24ab7ef10021ffc39ac21 |
Provenance
The following attestation bundles were made for fms_acceleration_aadp-0.2.0-py3-none-any.whl:
Publisher: build-and-publish.yml on foundation-model-stack/fms-acceleration

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: fms_acceleration_aadp-0.2.0-py3-none-any.whl
- Subject digest: 96285ca6a168cac90d70aea7b83677709aadd9fc165aeec753c05bab3ab295fa
- Sigstore transparency entry: 162770306
- Permalink: foundation-model-stack/fms-acceleration@5b41478c7e09e792aab1df59260534ea4ffc6002
- Branch / Tag: refs/tags/v0.6.0
- Owner: https://github.com/foundation-model-stack
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: build-and-publish.yml@5b41478c7e09e792aab1df59260534ea4ffc6002
- Trigger Event: release