Attention Gym
Attention Gym is a collection of helpful tools and examples for working with flex-attention
🎯 Features | 🚀 Getting Started | 💻 Usage | 🛠️ Dev | 🤝 Contributing | ⚖️ License
📖 Overview
This repository aims to provide a playground for experimenting with various attention mechanisms using the FlexAttention API. It includes implementations of different attention variants, performance comparisons, and utility functions to help researchers and developers explore and optimize attention mechanisms in their models.
🎯 Features
- Implementations of various attention mechanisms using FlexAttention
- Utility functions for creating and combining attention masks (see the sketch after this list)
- Examples of how to use FlexAttention in real-world scenarios
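As an example of the mask utilities, and_masks (exported by torch.nn.attention.flex_attention in PyTorch 2.5+) composes simple mask_mods into richer patterns. A minimal sketch, assuming an illustrative 256-token window and 1024-token sequence:

```python
# Combine two mask_mods with and_masks; the window size and sequence
# length below are illustrative assumptions.
from torch.nn.attention.flex_attention import create_block_mask, and_masks

def causal_mask(b, h, q_idx, kv_idx):
    # Attend only to the current and earlier positions.
    return q_idx >= kv_idx

def window_mask(b, h, q_idx, kv_idx):
    # Attend only within a 256-token lookback window.
    return q_idx - kv_idx <= 256

# Intersection of the two predicates: causal AND inside the window.
combined_mask_mod = and_masks(causal_mask, window_mask)
block_mask = create_block_mask(combined_mask_mod, B=1, H=1, Q_LEN=1024, KV_LEN=1024, device="cpu")
```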
🚀 Getting Started
Prerequisites
- PyTorch (version 2.5 or higher)
Installation
git clone https://github.com/pytorch-labs/attention-gym.git
cd attention-gym
pip install .
💻 Usage
There are two main ways to use Attention Gym:
1. Run Example Scripts: Many files in the project can be executed directly to demonstrate their functionality:
python attn_gym/masks/document_mask.py
These scripts often generate visualizations to help you understand the attention mechanisms.
2. Import in Your Projects: You can use Attention Gym components in your own work by importing them:
```python
from torch.nn.attention.flex_attention import flex_attention, create_block_mask
from attn_gym.masks import generate_sliding_window

# Use the imported function in your code
sliding_window_mask_mod = generate_sliding_window(window_size=1024)
block_mask = create_block_mask(sliding_window_mask_mod, 1, 1, S, S, device=device)
out = flex_attention(query, key, value, block_mask=block_mask)
```
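The snippet above assumes that S, device, and the query/key/value tensors already exist. For reference, here is a self-contained version; the shapes and sequence length are illustrative assumptions, not values the library requires:

```python
# Self-contained sketch of the snippet above; B, H, S, D are
# illustrative shape assumptions.
import torch
from torch.nn.attention.flex_attention import flex_attention, create_block_mask
from attn_gym.masks import generate_sliding_window

B, H, S, D = 2, 8, 2048, 64
device = "cuda" if torch.cuda.is_available() else "cpu"
query, key, value = (torch.randn(B, H, S, D, device=device) for _ in range(3))

sliding_window_mask_mod = generate_sliding_window(window_size=1024)
block_mask = create_block_mask(sliding_window_mask_mod, 1, 1, S, S, device=device)
out = flex_attention(query, key, value, block_mask=block_mask)
print(out.shape)  # torch.Size([2, 8, 2048, 64])
```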
For comprehensive examples of using FlexAttention in real-world scenarios, explore the examples/ directory. These end-to-end implementations showcase how to integrate various attention mechanisms into your models.
Note
Attention Gym is under active development, and we do not currently offer any backward compatibility guarantees. APIs and functionalities may change between versions. We recommend pinning to a specific version in your projects and carefully reviewing changes when upgrading.
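For example, if you consume the published package rather than a git checkout, you can pin a specific release when installing (the version shown is the one this page describes; pick whichever you have tested against):

pip install attn_gym==0.0.1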
📁 Structure
Attention Gym is organized for easy exploration of attention mechanisms:
🔍 Key Locations
- attn_gym.masks: Examples creating BlockMasks
- attn_gym.mods: Examples creating score_mods (a sketch follows this list)
- examples/: Detailed implementations using FlexAttention
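To make the attn_gym.mods entry concrete, here is a minimal ALiBi-style score_mod sketch; it is an illustrative example, not a specific attn_gym export:

```python
# A score_mod receives the raw attention score plus (batch, head,
# query, key) indices and returns a modified score. The per-head
# slope rule here is an illustrative assumption.
import torch
from torch.nn.attention.flex_attention import flex_attention

def alibi_bias(score, b, h, q_idx, kv_idx):
    # Linearly penalize distant positions, steeper for later heads.
    scale = torch.exp2(-(h + 1).to(torch.float32))
    return score - scale * (q_idx - kv_idx).abs()

B, H, S, D = 1, 4, 512, 64
q, k, v = (torch.randn(B, H, S, D) for _ in range(3))
out = flex_attention(q, k, v, score_mod=alibi_bias)
```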
🛠️ Dev
Install dev requirements
pip install -e ".[dev]"
Install pre-commit hooks
pre-commit install
🤝 Contributing
We welcome contributions to Attention Gym, especially new mask_mods or score_mods! Here's how you can contribute:
Contributing Mods
- Create a new file in attn_gym/masks/ for mask_mods or attn_gym/mods/ for score_mods.
- Implement your function and add a simple main function that showcases it (a skeleton sketch follows this list).
- Update the attn_gym/*/__init__.py file to include your new function.
- [Optionally] Add an end-to-end example using your new function in the examples/ directory.
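As a rough template, a new mask_mod file might look like the following; the module name, generate_my_mask, and the strided pattern are all hypothetical:

```python
# Hypothetical skeleton for attn_gym/masks/my_mask.py; the names and
# the strided pattern are illustrative only.
from torch.nn.attention.flex_attention import create_block_mask

def generate_my_mask(stride: int = 4):
    """Return a mask_mod that attends only to every `stride`-th key."""
    def my_mask(b, h, q_idx, kv_idx):
        return kv_idx % stride == 0
    return my_mask

def main():
    # Simple showcase, per the checklist above.
    mask_mod = generate_my_mask(stride=4)
    block_mask = create_block_mask(mask_mod, B=1, H=1, Q_LEN=128, KV_LEN=128, device="cpu")
    print(block_mask)

if __name__ == "__main__":
    main()
```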
See CONTRIBUTING.md for more details.
⚖️ License
attention-gym is released under the BSD 3-Clause License.