GeidiPrime
This is an extremely experimental Transformer architecture that combines Macaron-like FFNs with local attention. Perhaps we can add the visual expert from Zeta and make it multi-modal!
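For readers unfamiliar with the idea, a Macaron-style block sandwiches attention between two half-step feed-forward layers, and "local" attention restricts each token to a sliding window of neighbors. The sketch below is a hypothetical illustration of that pattern in plain PyTorch, assuming a window mask built with `nn.MultiheadAttention`; it is not GeidiPrime's actual internals.

```python
import torch
from torch import nn


class MacaronLocalBlock(nn.Module):
    """Illustrative Macaron-style block: half-step FFN -> windowed
    self-attention -> half-step FFN, each with a residual connection.
    All names and hyperparameters here are assumptions for the sketch."""

    def __init__(self, dim, heads, window=4, ff_mult=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ff1 = nn.Sequential(
            nn.Linear(dim, dim * ff_mult), nn.GELU(), nn.Linear(dim * ff_mult, dim)
        )
        self.ff2 = nn.Sequential(
            nn.Linear(dim, dim * ff_mult), nn.GELU(), nn.Linear(dim * ff_mult, dim)
        )
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.norm3 = nn.LayerNorm(dim)
        self.window = window

    def forward(self, x):
        n = x.shape[1]
        # Local attention mask: True entries are *disallowed*, so token i
        # may only attend to tokens j with |i - j| <= window.
        idx = torch.arange(n, device=x.device)
        mask = (idx[None, :] - idx[:, None]).abs() > self.window

        x = x + 0.5 * self.ff1(self.norm1(x))  # first half-step FFN
        h = self.norm2(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask)
        x = x + attn_out                        # windowed self-attention
        x = x + 0.5 * self.ff2(self.norm3(x))  # second half-step FFN
        return x
```

Each block preserves the input shape `(batch, seq_len, dim)`, so blocks can be stacked to whatever depth is desired.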
Install
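No install command is given on the page; judging from the source distribution name below (`geidiprime-0.0.1.tar.gz`), installation from PyPI is presumably:

```shell
# assumed install command, based on the distribution name on this page
pip3 install geidiprime
```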
Usage
```python
import torch
from geidi_prime.model import GeidiPrimeTransformer

model = GeidiPrimeTransformer(
    dim=4096,         # embedding dimension
    depth=6,          # number of transformer blocks
    heads=8,          # attention heads per block
    num_tokens=20000, # vocabulary size
)

# a batch of random token ids: (batch, seq_len)
x = torch.randint(0, 20000, (1, 4096))

out = model(x)
print(out.shape)
```
License
MIT