Simba
A simple PyTorch + Zeta implementation of the paper: "SiMBA: Simplified Mamba-based Architecture for Vision and Multivariate Time series"
Install
$ pip install simba-torch
Usage
import torch
from simba_torch.main import Simba

# A batch of one RGB image, 224 x 224
img = torch.randn(1, 3, 224, 224)

# Create the model
model = Simba(
    dim=4,             # Embedding dimension of the model
    dropout=0.1,       # Dropout rate for regularization
    d_state=64,        # Dimension of the SSM (state space model) state
    d_conv=64,         # Dimension of the convolutional layers
    num_classes=64,    # Number of output classes
    depth=8,           # Number of Simba blocks
    patch_size=16,     # Size of the image patches
    image_size=224,    # Size of the input image
    channels=3,        # Number of input channels
    # use_pos_emb=True,  # Optionally enable positional embeddings
)

# Forward pass
out = model(img)
print(out.shape)
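As a quick sanity check on the hyperparameters above: with `image_size=224` and `patch_size=16`, the image is split into a 14 x 14 grid of non-overlapping patches, so the model processes 196 tokens per image. A minimal sketch of that arithmetic:

```python
# Patch count for the configuration shown above.
image_size = 224
patch_size = 16

patches_per_side = image_size // patch_size  # 224 / 16 = 14
num_patches = patches_per_side ** 2          # 14 * 14 = 196
print(num_patches)                           # 196
```

Note that `image_size` must be divisible by `patch_size` for the patching to tile the image exactly.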
License
MIT