Project description

AttentionSmithy

The Attention Is All You Need paper revolutionized the AI industry. After it inspired models like GPT and BERT, deep learning research began focusing heavily on the attention mechanism behind transformers. This has produced a great deal of research on the topic, spawning hundreds of variations on the original architecture meant to enhance it or tailor it to new applications. Most of these developments happen in isolation, disconnected from the broader community and incompatible with tools made by other developers. For developers who want to experiment with combining these ideas to fit a new problem, such a disjointed state is frustrating.

AttentionSmithy was designed as a platform for flexible experimentation with the attention mechanism across a variety of applications. This includes the ability to mix and match a multitude of positional embedding strategies, variations on the attention mechanism, and other components.

The baseline code was originally inspired by the code from The Annotated Transformer blog post. We have created examples of transformer models in separate repositories.

Future Directions


🤝 Join the conversation! 🤝

As you read and have ideas, please go to the Discussions tab of this repository and share them with us. We have ideas for future extensions and applications, and would love your input.


Setting Up the Python Environment

To ensure compatibility, use Python 3.9 or greater. You can set up the environment using either Conda or a virtual environment with venv.

Using Conda

  1. Create a new Conda environment:
    conda create --name as_env python
    
  2. Activate the environment:
    conda activate as_env
    

Using venv

  1. Create a virtual environment:
    python -m venv as_env
    
  2. Activate the environment:
    • On macOS/Linux:
      source as_env/bin/activate
      
    • On Windows:
      as_env\Scripts\activate
      

Installation

You can install attention-smithy using either of the following methods:

Install from PyPI

The simplest way to install attention-smithy is via pip:

pip install attention-smithy

Install from GitHub

Alternatively, you can install the latest version directly from the GitHub repository:

git clone https://github.com/xomicsdatascience/AttentionSmithy.git
cd AttentionSmithy
pip install -e .

This method is recommended if you want to work with the latest source code or contribute to the project.
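
To confirm the installation succeeded, you can try importing the package. The import name attention_smithy is taken from the distribution files listed further down this page; this is a quick sanity check rather than part of the official instructions:

    python -c "import attention_smithy; print('attention-smithy installed')"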

AttentionSmithy Components

Here is a visual depiction of the different components of a transformer model, using Figure 1 from Attention Is All You Need as a reference.

[Figure: components of a transformer model, after Figure 1 of Attention Is All You Need]
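
For orientation, the sketch below wires those components together as a single encoder block (multi-head attention, position-wise feed-forward, residual connections, layer normalization) in plain PyTorch. It is an illustration of the architecture in Figure 1, not AttentionSmithy's own API:

    import torch
    import torch.nn as nn

    class EncoderBlock(nn.Module):
        """One post-norm encoder block, following Figure 1 of Attention Is All You Need."""
        def __init__(self, d_model=512, num_heads=8, d_ff=2048, dropout=0.1):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, num_heads, dropout=dropout, batch_first=True)
            self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            self.norm1 = nn.LayerNorm(d_model)
            self.norm2 = nn.LayerNorm(d_model)
            self.dropout = nn.Dropout(dropout)

        def forward(self, x):
            # Self-attention sublayer with residual connection, then layer norm
            attn_out, _ = self.attn(x, x, x)
            x = self.norm1(x + self.dropout(attn_out))
            # Feed-forward sublayer with residual connection, then layer norm
            return self.norm2(x + self.dropout(self.ff(x)))

    x = torch.randn(2, 10, 512)  # (batch, sequence length, embedding dimension)
    print(EncoderBlock()(x).shape)  # torch.Size([2, 10, 512])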

AttentionSmithy Numeric Embedding

Here is a visual depiction of where each positional or numeric embedding fits into the original model. We have implemented four popular strategies (sinusoidal, learned, rotary, ALiBi), but would like to expand to more in the future.

[Figure: where each positional or numeric embedding strategy enters the original model]
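
As one concrete example, here is a minimal sketch of the sinusoidal strategy from the original paper, written in plain PyTorch. The function name and interface are illustrative, not AttentionSmithy's:

    import math
    import torch

    def sinusoidal_embedding(seq_len, d_model):
        """PE(pos, 2i) = sin(pos / 10000^(2i/d_model)); PE(pos, 2i+1) = cos(same)."""
        position = torch.arange(seq_len).unsqueeze(1)  # (seq_len, 1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(seq_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)  # even dimensions
        pe[:, 1::2] = torch.cos(position * div_term)  # odd dimensions
        return pe  # added to token embeddings before the first encoder layer

    print(sinusoidal_embedding(10, 512).shape)  # torch.Size([10, 512])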

AttentionSmithy Attention Methods

Here is a basic visual of attention mechanisms AttentionSmithy has been designed to incorporate in future development efforts. The provided examples include Longformer attention and Big Bird attention.

[Figure: attention mechanism variants, including Longformer and Big Bird]
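
To give a sense of how such sparse patterns work, here is a sketch of a Longformer-style sliding-window mask in plain PyTorch. Production implementations rely on custom kernels for efficiency; this illustrates the pattern only and is not AttentionSmithy code:

    import torch

    def sliding_window_mask(seq_len, window):
        """True marks pairs a query may attend to: each token sees only neighbors
        within `window` positions, so cost grows linearly rather than quadratically."""
        idx = torch.arange(seq_len)
        return (idx.unsqueeze(0) - idx.unsqueeze(1)).abs() <= window

    mask = sliding_window_mask(seq_len=8, window=2)
    scores = torch.randn(8, 8)                         # raw attention scores for one head
    scores = scores.masked_fill(~mask, float("-inf"))  # block out-of-window pairs
    weights = scores.softmax(dim=-1)                   # rows sum to 1 over allowed positions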

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

attention_smithy-1.2.0.tar.gz (50.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

attention_smithy-1.2.0-py3-none-any.whl (62.4 kB)

Uploaded Python 3

File details

Details for the file attention_smithy-1.2.0.tar.gz.

File metadata

  • Download URL: attention_smithy-1.2.0.tar.gz
  • Upload date:
  • Size: 50.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.10

File hashes

Hashes for attention_smithy-1.2.0.tar.gz

  • SHA256: 9a6ec23ff2c783cd6175c96657bde96cd236b792590c4730e167f511ba1f2adb
  • MD5: 66fd43913b034ec7c72c8427937ee832
  • BLAKE2b-256: b33341517981cf3b10d1e651d979082cdf270856c5c411d81314624925041756

See more details on using hashes here.
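
As an example of putting these digests to use, the following sketch checks a downloaded sdist against the SHA256 value listed above using Python's standard hashlib module:

    import hashlib

    expected = "9a6ec23ff2c783cd6175c96657bde96cd236b792590c4730e167f511ba1f2adb"
    with open("attention_smithy-1.2.0.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    assert digest == expected, "hash mismatch: do not install this file"
    print("SHA256 verified")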

File details

Details for the file attention_smithy-1.2.0-py3-none-any.whl.

File hashes

Hashes for attention_smithy-1.2.0-py3-none-any.whl

  • SHA256: 36e608af7dfaa816211cc0f12b9f9669afec9b84d612e89cc62ab80aaa3f7982
  • MD5: e4d61c11b8eddd78b0732ad712600340
  • BLAKE2b-256: f1ce55c49ec4894e8b910d06b8cb5b76a4520cc924679b00a5134ec368020d65

See more details on using hashes here.
