
jamba - PyTorch

Project description


Jamba

A PyTorch implementation of the model described in the paper "Jamba: A Hybrid Transformer-Mamba Language Model".

Install

$ pip install jamba

Usage

import torch 
from jamba.model import JambaBlock

# Create a random input tensor of shape (batch, seq_len, dim) = (1, 128, 512)
x = torch.randn(1, 128, 512)

# Create an instance of the JambaBlock class
jamba = JambaBlock(
    512,  # input channels
    128,  # hidden channels
    128,  # key channels
    8,    # number of heads
    4,    # number of layers
)

# Pass the input tensor through the JambaBlock
output = jamba(x)

# Print the shape of the output tensor
print(output.shape)
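Since JambaBlock appears to be a standard torch.nn.Module, it can be dropped into an ordinary PyTorch training loop. The sketch below runs a single forward/backward step against a dummy regression target; the constructor arguments mirror the usage example above, while the optimizer, loss, and target tensor are illustrative assumptions only (the sketch also assumes the block preserves the input shape, as the shape check above suggests).

import torch
from jamba.model import JambaBlock

# Same configuration as the usage example above
block = JambaBlock(512, 128, 128, 8, 4)

x = torch.randn(1, 128, 512)       # (batch, seq_len, dim)
target = torch.randn(1, 128, 512)  # dummy target, for illustration only

# Hypothetical optimizer/loss choices, not part of the package
optimizer = torch.optim.AdamW(block.parameters(), lr=1e-4)

# One forward/backward step
out = block(x)
loss = torch.nn.functional.mse_loss(out, target)
loss.backward()
optimizer.step()
optimizer.zero_grad()

print(loss.item())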

License

MIT

