Transformers at zetascale
Zeta - Seamlessly Create Zetascale Transformers
Create ultra-powerful multi-modality models seamlessly and efficiently, in as few lines of code as possible.
🤝 Schedule a 1-on-1 Session
Book a 1-on-1 Session with Kye, the Creator, to discuss any issues, provide feedback, or explore how we can improve Zeta for you.
Installation
To install:
pip install zetascale
To get hands-on and develop it locally:
git clone https://github.com/kyegomez/zeta.git
cd zeta
pip install -e .
Initiating Your Journey
Creating a model from these research-grade building blocks is straightforward. Here's how to quickly instantiate Flash Attention:
import torch
from zeta import FlashAttention

# Shapes are (batch, heads, seq_len, head_dim); keys and values
# may use a different sequence length than the queries.
q = torch.randn(2, 4, 6, 8)
k = torch.randn(2, 4, 10, 8)
v = torch.randn(2, 4, 10, 8)

attention = FlashAttention(causal=False, dropout=0.1, flash=True)
output = attention(q, k, v)

print(output.shape)  # should match q's shape: (2, 4, 6, 8)
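For intuition, Flash Attention computes ordinary scaled dot-product attention; its contribution is an IO-aware kernel, not different math. The naive NumPy sketch below (our own illustrative function, not part of the Zeta API) shows the computation and why the output inherits the query's sequence length:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Naive reference attention: softmax(q k^T / sqrt(d)) v.

    Shapes: q is (B, H, Lq, D); k and v are (B, H, Lk, D).
    Returns (B, H, Lq, D) -- the query length is preserved.
    """
    d = q.shape[-1]
    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(d)   # (B, H, Lq, Lk)
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                                   # (B, H, Lq, D)

q = np.random.randn(2, 4, 6, 8)
k = np.random.randn(2, 4, 10, 8)
v = np.random.randn(2, 4, 10, 8)

out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 4, 6, 8)
```

This materializes the full (Lq, Lk) attention matrix; the point of the flash kernel is to get the same result without ever storing that matrix in high-bandwidth memory.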
Documentation
The documentation is available at zeta.apac.ai.
Vision
Zeta hopes to be the leading framework and library to effortlessly enable you to create the most capable and reliable foundation models out there with infinite scalability.
Acknowledgments
Zeta is inspired by LucidRains's repositories and by elements of FairSeq and UniLM.
Contributing
We depend on your contributions. Kye is currently the sole maintainer of this repository, which is a heavy lift, so any contribution is deeply appreciated, not just by me but by all of Zeta's users who depend on this repository to build the world's best AI models.
- Head over to the project board to look at open features to implement or bugs to tackle
Hashes for zetascale-0.5.4-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | f4fbf7af85606d64127d1eeab8bd98f7c7034988100679d7883bcc63aacedd1f |
| MD5 | 5174dc4006329aabd87e1a74bf3466c0 |
| BLAKE2b-256 | 4d7d9cd794c090b51b8fd434de73bbb0cbd0415ed8ae481ecf15c47b5ed1885e |