# AttentionSmithy
The Attention Is All You Need paper revolutionized the AI industry. After it inspired programs such as GPT and BERT, much of deep learning research shifted its focus to the attention mechanism behind transformers. This has produced a great deal of research on the topic, spawning hundreds of variations on the original architecture meant to enhance it or tailor it to new applications. Most of these developments happen in isolation, disconnected from the broader community and incompatible with tools made by other developers. For developers who want to experiment with combining these ideas to fit a new problem, such a disjointed state is frustrating.
AttentionSmithy was designed as a platform for flexible experimentation with the attention mechanism across a variety of applications. This includes the ability to mix and match a multitude of positional embeddings, variations on the attention mechanism, and other components.
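To make the mix-and-match idea concrete, here is a minimal plain-PyTorch sketch of the kind of composition AttentionSmithy aims to enable. The class names and interfaces below are illustrative assumptions made for this README, not the package's actual API; see the example repositories below for real usage.

```python
import torch
import torch.nn as nn

class ToyEncoderBlock(nn.Module):
    """Toy block with swappable position-embedding and attention components.
    Hypothetical stand-in for AttentionSmithy-style composition, not its real API."""

    def __init__(self, embed_dim: int, num_heads: int, position_embedding: nn.Module):
        super().__init__()
        self.position_embedding = position_embedding  # any module: (batch, seq, dim) -> (batch, seq, dim)
        self.attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.position_embedding(x)
        attended, _ = self.attention(x, x, x, need_weights=False)
        return self.norm(x + attended)  # residual connection + layer norm

class LearnedPositionEmbedding(nn.Module):
    """One interchangeable strategy: a trainable embedding per position."""

    def __init__(self, max_len: int, embed_dim: int):
        super().__init__()
        self.table = nn.Embedding(max_len, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        positions = torch.arange(x.size(1), device=x.device)
        return x + self.table(positions)

# Swapping in a different positional strategy only changes this one argument.
block = ToyEncoderBlock(embed_dim=64, num_heads=4,
                        position_embedding=LearnedPositionEmbedding(512, 64))
print(block(torch.randn(2, 10, 64)).shape)  # torch.Size([2, 10, 64])
```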
The baseline code was originally inspired by The Annotated Transformer blog post. We have created examples of transformer models in the following repositories:
- machine translation, including a neural architecture search (NAS) setup
- geneformer
## Future Directions

### 🤝 Join the conversation! 🤝
As you read and have ideas, please go to the Discussions tab of this repository and share them with us. We have ideas for future extensions and applications, and would love your input.
## Setting Up the Python Environment
To ensure compatibility, use Python 3.9 or greater. You can set up the environment using either Conda or a virtual environment with `venv`.
### Using Conda

- Create a new Conda environment:

  ```bash
  conda create --name as_env python
  ```

- Activate the environment:

  ```bash
  conda activate as_env
  ```
### Using venv

- Create a virtual environment:

  ```bash
  python -m venv as_env
  ```

- Activate the environment:
  - On macOS/Linux:

    ```bash
    source as_env/bin/activate
    ```

  - On Windows:

    ```bash
    as_env\Scripts\activate
    ```
## Installation
You can install `attention-smithy` using either of the following methods:
### Install from PyPI
The simplest way to install `attention-smithy` is via `pip`:

```bash
pip install attention-smithy
```
### Install from GitHub
Alternatively, you can install the latest version directly from the GitHub repository:
```bash
git clone https://github.com/xomicsdatascience/AttentionSmithy.git
cd AttentionSmithy
pip install -e .
```
This method is recommended if you want to work with the latest source code or contribute to the project.
## AttentionSmithy Components
Here is a visual depiction of the different components of a transformer model, using Figure 1 from Attention Is All You Need as a reference.
## AttentionSmithy Numeric Embedding
Here is a visual depiction of where each positional or numeric embedding fits into the original model. We have implemented four popular strategies (sinusoidal, learned, rotary, ALiBi), and would like to expand to more in the future.
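As a concrete reference for one of these strategies, below is a minimal plain-PyTorch sketch of the sinusoidal encoding from the original paper. It illustrates the strategy itself, not AttentionSmithy's internal implementation.

```python
import math
import torch

def sinusoidal_position_encoding(seq_len: int, embed_dim: int) -> torch.Tensor:
    """Fixed sinusoidal encoding from Attention Is All You Need:
    PE(pos, 2i)   = sin(pos / 10000^(2i/d))
    PE(pos, 2i+1) = cos(pos / 10000^(2i/d))
    Assumes an even embed_dim."""
    position = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)  # (seq_len, 1)
    div_term = torch.exp(
        torch.arange(0, embed_dim, 2, dtype=torch.float32)
        * (-math.log(10000.0) / embed_dim)
    )  # (embed_dim / 2,)
    encoding = torch.zeros(seq_len, embed_dim)
    encoding[:, 0::2] = torch.sin(position * div_term)  # even dimensions
    encoding[:, 1::2] = torch.cos(position * div_term)  # odd dimensions
    return encoding

# Added to token embeddings before the first encoder layer:
tokens = torch.randn(2, 10, 64)  # (batch, seq, dim)
embedded = tokens + sinusoidal_position_encoding(10, 64)
```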
## AttentionSmithy Attention Methods
Here is a basic visual of the attention mechanisms AttentionSmithy has been designed to incorporate in future development efforts. The provided examples include Longformer attention and Big Bird attention.
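Both Longformer and Big Bird restrict each token to a sparse subset of attention targets rather than the full sequence. The sketch below illustrates the core of that idea with a Longformer-style sliding-window mask in plain PyTorch; the global and random attention patterns those papers add are omitted for brevity, and this is a conceptual illustration rather than AttentionSmithy's implementation.

```python
import torch

def sliding_window_mask(seq_len: int, window: int) -> torch.Tensor:
    """Boolean mask where True marks positions each query may attend to.
    Each token sees neighbors within +/- `window` positions, the core of
    Longformer-style local attention."""
    positions = torch.arange(seq_len)
    distance = (positions.unsqueeze(0) - positions.unsqueeze(1)).abs()
    return distance <= window  # (seq_len, seq_len)

mask = sliding_window_mask(seq_len=8, window=2)
scores = torch.randn(8, 8)                         # raw attention scores
scores = scores.masked_fill(~mask, float("-inf"))  # block out-of-window pairs
weights = torch.softmax(scores, dim=-1)            # each row sums to 1 over its window
```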