jamba - PyTorch
Project description
Jamba
PyTorch implementation of Jamba, from the paper "Jamba: A Hybrid Transformer-Mamba Language Model".
install
$ pip install jamba
usage
import torch
from jamba.model import JambaBlock
# Create a random input tensor of shape (batch=1, seq_len=128, dim=512)
x = torch.randn(1, 128, 512)
# Create an instance of the JambaBlock class
jamba = JambaBlock(
    512,  # input channels
    128,  # hidden channels
    128,  # key channels
    8,    # number of heads
    4,    # number of layers
)
# Pass the input tensor through the JambaBlock
output = jamba(x)
# Print the shape of the output tensor
print(output.shape)
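Below is a minimal training-step sketch building on the usage example above. It assumes JambaBlock is a standard torch.nn.Module whose output keeps the input's (batch, seq_len, dim) shape; the optimizer, loss, and dummy target are illustrative choices, not part of the package's documented API.

import torch
from jamba.model import JambaBlock

# Same constructor arguments as the usage example above.
block = JambaBlock(512, 128, 128, 8, 4)
optimizer = torch.optim.AdamW(block.parameters(), lr=1e-4)

x = torch.randn(1, 128, 512)       # dummy input batch
target = torch.randn(1, 128, 512)  # dummy regression target (illustrative)

output = block(x)                  # forward pass through the hybrid block
loss = torch.nn.functional.mse_loss(output, target)

optimizer.zero_grad()
loss.backward()                    # backpropagate through the block's parameters
optimizer.step()
print(loss.item())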
License
MIT
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
jamba-0.0.1.tar.gz (7.2 kB)
Built Distribution
jamba-0.0.1-py3-none-any.whl (7.3 kB)
File details
Details for the file jamba-0.0.1.tar.gz.
File metadata
- Download URL: jamba-0.0.1.tar.gz
- Upload date:
- Size: 7.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/23.3.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | ce3fc09a23aa708d5da4df6ca9331393fa00f9093f97a4fb98e742f94d2e2999
MD5 | 956f9c6b87a86f2df2cd14074d45a79e
BLAKE2b-256 | 02127bae47c6f4daa9db3da0b09a51774752a97e2e037ff5185a1a276482cb65
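If you want to check a downloaded file against the published digests, a short sketch using Python's standard hashlib follows; the local filename is assumed to be the sdist listed above, saved in the current directory.

import hashlib

# Assumed local path to the downloaded source distribution; adjust as needed.
path = "jamba-0.0.1.tar.gz"

# SHA256 digest published on this page for the source distribution.
expected = "ce3fc09a23aa708d5da4df6ca9331393fa00f9093f97a4fb98e742f94d2e2999"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "MISMATCH")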
File details
Details for the file jamba-0.0.1-py3-none-any.whl.
File metadata
- Download URL: jamba-0.0.1-py3-none-any.whl
- Upload date:
- Size: 7.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/23.3.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 8fddb3514e5eac6223c9dd0580884b1dd046680729f558fca5995a92bc65950d
MD5 | 559bb94b1e7b2a562e67b1439b84cbab
BLAKE2b-256 | c9a2e874fbf07fa495ab583900082af23540e7b8e290c284185425c8c7f236c4