MoE Mamba
Implementation of MoE-Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts" in PyTorch and Zeta.
Install
```bash
pip install moe-mamba
```
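The downloads listed further down this page correspond to release 0.0.3, so you can pin that exact version if you need reproducible installs:

```bash
pip install moe-mamba==0.0.3
```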
Usage
MoEMambaBlock
```python
import torch
from moe_mamba import MoEMambaBlock

# Input tensor of shape (batch, seq_len, dim)
x = torch.randn(1, 10, 512)

model = MoEMambaBlock(
    dim=512,
    depth=6,
    d_state=128,
    expand=4,
    num_experts=4,
)
out = model(x)
print(out)
```
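Here `dim` is the model width, `num_experts` the number of experts in the MoE layers, and `d_state`/`expand` the usual Mamba state-size and expansion hyperparameters; `depth` presumably controls the number of stacked layers. The example suggests the block is shape-preserving, mapping `(batch, seq_len, dim)` to a tensor of the same shape. A quick sanity check under that assumption:

```python
# Assumes MoEMambaBlock preserves the input shape, as the usage
# example above suggests (dim=512 matches the last axis of x);
# the README does not state this explicitly.
assert out.shape == x.shape
print(out.shape)  # torch.Size([1, 10, 512])
```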
Code Quality 🧹
- `make style` to format the code
- `make check_code_quality` to check code quality (PEP8, basically)
- `black .`
- `ruff . --fix`
Citation
```bibtex
@misc{pióro2024moemamba,
    title={MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts},
    author={Maciej Pióro and Kamil Ciebiera and Krystian Król and Jan Ludziejewski and Sebastian Jaszczur},
    year={2024},
    eprint={2401.04081},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```
License
MIT
Download files

Source Distribution: moe_mamba-0.0.3.tar.gz (5.6 kB)

Built Distribution: moe_mamba-0.0.3-py3-none-any.whl
Hashes for moe_mamba-0.0.3-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 937d8d3cf2c65058f74761228b5879a4eb7c985b15e47a572a4048ddbbc7e913 |
| MD5 | c347b19ec36cfe08a7aae63caac28804 |
| BLAKE2b-256 | 550fc5c5ba01552ebfb72618a7d612f69602a51de4333b2f9b3b76927f61383d |
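To confirm that a downloaded wheel matches the published digest, hash it locally and compare. A minimal sketch, assuming the wheel sits in the current directory under its PyPI filename:

```python
# Compare a downloaded wheel's SHA256 digest against the value
# published on PyPI. The file path is an assumption; point it at
# wherever the wheel was actually saved.
import hashlib

EXPECTED_SHA256 = "937d8d3cf2c65058f74761228b5879a4eb7c985b15e47a572a4048ddbbc7e913"

with open("moe_mamba-0.0.3-py3-none-any.whl", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "MISMATCH")
```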