Transformers at zeta scales
Project description
Zeta - Seamlessly Create Zetascale Transformers
Create Ultra-Powerful Multi-Modality Models Seamlessly and Efficiently
Installation
To install:
pip install zetascale
To get hands-on and develop it locally:
git clone https://github.com/kyegomez/zeta.git
cd zeta
pip install -e .
Initiating Your Journey
Creating a model empowered with the aforementioned breakthrough research features is straightforward. Here's how to quickly instantiate the renowned Flash Attention:
import torch
from zeta import FlashAttention

# Query, key, and value tensors: (batch, heads, seq_len, head_dim)
q = torch.randn(2, 4, 6, 8)
k = torch.randn(2, 4, 10, 8)
v = torch.randn(2, 4, 10, 8)

attention = FlashAttention(causal=False, dropout=0.1, flash=True)
output = attention(q, k, v)

print(output.shape)  # torch.Size([2, 4, 6, 8]) -- matches the query shape
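For intuition about the shapes above, here is a minimal NumPy sketch of plain scaled dot-product attention. It is not the memory-efficient Flash algorithm and `naive_attention` is an illustrative helper, not part of Zeta's API, but it produces the same output shape from the same inputs.

```python
import numpy as np

def naive_attention(q, k, v):
    # q: (batch, heads, q_len, dim); k, v: (batch, heads, kv_len, dim)
    scale = q.shape[-1] ** -0.5
    # Attention scores: (batch, heads, q_len, kv_len)
    scores = (q @ k.transpose(0, 1, 3, 2)) * scale
    # Numerically stable softmax over the key dimension
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values: (batch, heads, q_len, dim)
    return weights @ v

q = np.random.randn(2, 4, 6, 8)
k = np.random.randn(2, 4, 10, 8)
v = np.random.randn(2, 4, 10, 8)
out = naive_attention(q, k, v)
print(out.shape)  # (2, 4, 6, 8) -- same as the query shape
```

The output always inherits the query's sequence length, which is why the 6-token query attending over 10 keys still yields a length-6 result.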
Acknowledgments
Zeta is a masterpiece inspired by LucidRains's repositories and elements of FairSeq and UniLM.
Citations
If our work here in Zeta has aided you in your journey, please consider acknowledging our efforts in your work. You can find relevant citation details in our Citations Document.
Contributing
We depend on your contributions: Kye is the sole maintainer of this repository, which is difficult, so every contribution is deeply appreciated, not just by me but by Zeta's users, who depend on this repository to build the world's best AI models.
- Head over to the project board to look at open features to implement or bugs to tackle
Hashes for zetascale-0.3.3-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 0efff461da29e33ca978e964b1f475be60f519a8317145a9aaa6fee6970a5cb9
MD5 | 13d56f27d0dc1998e4c4b9dbed613c62
BLAKE2b-256 | 8e9ac86cd8d343c7b129fb8141e37d7cd4cd3f24efeac7bfedc8dd51309e40a8