MambaFormer

Implementation of MambaFormer in PyTorch and Zeta, from the paper: "Can Mamba Learn How to Learn? A Comparative Study on In-Context Learning Tasks"

Install

pip3 install mamba-former

Usage

import torch
from mamba_former.main import MambaFormer

# Example input: a batch of token ids with shape (batch, seq_len)
# Tokens are integers
x = torch.randint(1, 1000, (1, 100))

# Model
model = MambaFormer(
    dim = 512,            # embedding dimension
    num_tokens = 1000,    # vocabulary size
    depth = 6,            # number of MambaFormer blocks
    d_state = 512,        # Mamba SSM state dimension
    d_conv = 128,         # Mamba local convolution width
    heads = 8,            # number of attention heads
    dim_head = 64,        # dimension per attention head
    return_tokens = True  # return logits over the vocabulary
)

# Forward
out = model(x)
print(out)
print(out.shape)
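
Training

If you want to train or fine-tune the model, a standard next-token-prediction step could look like the sketch below. This is a minimal sketch and not part of the package: it assumes that with return_tokens = True the forward pass returns logits of shape (batch, seq_len, num_tokens); the optimizer and loss are ordinary PyTorch.

import torch
import torch.nn.functional as F
from mamba_former.main import MambaFormer

model = MambaFormer(
    dim = 512,
    num_tokens = 1000,
    depth = 6,
    d_state = 512,
    d_conv = 128,
    heads = 8,
    dim_head = 64,
    return_tokens = True
)

optimizer = torch.optim.AdamW(model.parameters(), lr = 3e-4)

# Random token sequence standing in for real training data
tokens = torch.randint(1, 1000, (1, 100))

# Next-token prediction: shift inputs and targets by one position
inputs, targets = tokens[:, :-1], tokens[:, 1:]

logits = model(inputs)  # assumed shape: (batch, seq_len, num_tokens)
loss = F.cross_entropy(
    logits.reshape(-1, logits.size(-1)),
    targets.reshape(-1),
)

loss.backward()
optimizer.step()
optimizer.zero_grad()
print(loss.item())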

License

MIT
