Zamba - PyTorch
Zamba
Implementation of Zamba, the joint Mamba-Transformer model. It's now fully ready to train! Paper: https://arxiv.org/abs/2405.16712
Install
pip3 install zamba-torch
Usage
import torch  # torch for tensor creation and model execution

from zamba_torch.main import (
    Zamba,
)  # Import the Zamba model from the zamba_torch.main module

# Example usage
x = torch.randint(
    0, 256, (1, 512)
)  # Random token ids of shape (1, 512): batch size 1, sequence length 512

model = Zamba(
    dim=512,        # Model (embedding) dimension
    heads=8,        # Number of attention heads
    dim_head=64,    # Dimension of each attention head
    d_state=512,    # SSM state dimension
    dt_rank=128,    # Rank of the dt (Δ) projection
    d_conv=256,     # Dimension of the convolutional layer
    vocab_size=256, # Vocabulary size
    max_seq_len=512,# Maximum sequence length
)

print(
    model(x)
)  # Print the model output for the input tokens
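Since the model is described as ready to train, here is a minimal next-token training sketch. It is a hypothetical example, not part of this package: it assumes the forward pass returns logits of shape (batch, seq_len, vocab_size), which this README does not state, so check the actual output shape before using it.

import torch
import torch.nn.functional as F
from zamba_torch.main import Zamba

# Minimal sketch of a next-token training loop (assumption: model(x) -> logits
# of shape (batch, seq_len, vocab_size)).
model = Zamba(
    dim=512,
    heads=8,
    dim_head=64,
    d_state=512,
    dt_rank=128,
    d_conv=256,
    vocab_size=256,
    max_seq_len=512,
)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

for step in range(10):
    tokens = torch.randint(0, 256, (1, 512))  # random token ids as stand-in data
    logits = model(tokens)                    # assumed shape: (1, 512, 256)
    # Shift by one so position t predicts token t+1 (standard next-token objective).
    loss = F.cross_entropy(
        logits[:, :-1, :].reshape(-1, 256),
        tokens[:, 1:].reshape(-1),
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss {loss.item():.4f}")

Swap the random tokens for batches from a real tokenized dataset to train on actual text.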
License
MIT
Citation
@misc{glorioso2024zamba,
title={Zamba: A Compact 7B SSM Hybrid Model},
author={Paolo Glorioso and Quentin Anthony and Yury Tokpanov and James Whittington and Jonathan Pilault and Adam Ibrahim and Beren Millidge},
year={2024},
eprint={2405.16712},
archivePrefix={arXiv},
primaryClass={cs.LG}
}