Simple Mamba - PyTorch
Install
```bash
pip install simple-mamba
```
Usage
```python
import torch
from simple_mamba import MambaBlock

# Define block parameters
dim = 512
hidden_dim = 128
heads = 8
in_channels = 3
out_channels = 3
kernel_size = 3

# Create an instance of MambaBlock
mamba_block = MambaBlock(
    dim, hidden_dim, heads, in_channels, out_channels, kernel_size
)

# Create a sample input tensor of shape (batch, seq_len, dim)
x = torch.randn(1, dim, dim)

# Pass the tensor through the MambaBlock
output = mamba_block(x)
print("Output shape:", output.shape)
```
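For intuition about what a Mamba-style block computes, here is a minimal NumPy sketch of the discretized selective state-space recurrence described in the paper cited below. This is an illustrative toy, not the library's actual implementation; all names (`selective_scan`, the shapes, the zero-order-hold discretization) are assumptions made for the example.

```python
import numpy as np

def selective_scan(x, A, B, C, delta):
    """Toy selective state-space recurrence (not the library's implementation).

    x:     (L, D)  input sequence of length L with D channels
    A:     (D, N)  continuous-time state matrix (negative entries for stability)
    B:     (L, N)  input-dependent ("selective") input projection
    C:     (L, N)  input-dependent output projection
    delta: (L, D)  per-step, per-channel discretization step size
    Returns y: (L, D)
    """
    L, D = x.shape
    N = A.shape[1]
    h = np.zeros((D, N))          # hidden state, one N-dim state per channel
    y = np.zeros((L, D))
    for t in range(L):
        # Zero-order-hold discretization: A_bar = exp(delta * A)
        A_bar = np.exp(delta[t][:, None] * A)       # (D, N)
        B_bar = delta[t][:, None] * B[t][None, :]   # (D, N)
        h = A_bar * h + B_bar * x[t][:, None]       # state update
        y[t] = (h * C[t][None, :]).sum(axis=1)      # readout
    return y

# Small demo with random parameters
rng = np.random.default_rng(0)
L, D, N = 16, 4, 8
x = rng.standard_normal((L, D))
A = -np.abs(rng.standard_normal((D, N)))            # decaying dynamics
B = rng.standard_normal((L, N))
C = rng.standard_normal((L, N))
delta = 0.1 * np.abs(rng.standard_normal((L, D)))
y = selective_scan(x, A, B, C, delta)
print("y shape:", y.shape)
```

The key idea is that `B`, `C`, and `delta` vary per time step (in the real model they are computed from the input), which is what makes the state space "selective" rather than a fixed linear filter.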
License
MIT
Citation
```bibtex
@misc{gu2023mamba,
    title={Mamba: Linear-Time Sequence Modeling with Selective State Spaces},
    author={Albert Gu and Tri Dao},
    year={2023},
    eprint={2312.00752},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```