Paper - PyTorch
Project description
Multi-Head Mixture of Experts (MHMoE)
MH-MoE allows tokens to collectively attend to information from various representation spaces within different experts, deepening context understanding while significantly enhancing expert activation.
Install

```bash
pip3 install mh-moe
```
Usage

```python
import torch
from mh_moe.main import MHMoE

# Define model parameters
dim = 512
heads = 8
num_experts = 4
num_layers = 3

# Create MHMoE model instance
model = MHMoE(dim, heads, num_experts, num_layers)

# Generate dummy input
batch_size = 10
seq_length = 20
dummy_input = torch.rand(batch_size, seq_length, dim)
dummy_mask = torch.ones(batch_size, seq_length)  # example attention mask

# Forward pass through the model
output = model(dummy_input, dummy_mask)

# Print the output tensor and its shape
print(output)
print(output.shape)
```
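To make the mechanism concrete, here is a minimal, self-contained sketch of the multi-head MoE idea: each token is split into `heads` sub-tokens, every sub-token is routed to one expert by a top-1 gate, and the expert outputs are merged back into full tokens. This is an illustration only; the class `SimpleMHMoE` and its internals are this sketch's own names and are not the actual `mh_moe` implementation, which may differ in gating, expert design, and layer stacking.

```python
import torch
import torch.nn as nn


class SimpleMHMoE(nn.Module):
    """Illustrative multi-head mixture-of-experts layer (not the mh_moe API).

    Tokens are split head-wise into sub-tokens, each sub-token is routed to
    a single expert via top-1 gating, and the results are merged back.
    """

    def __init__(self, dim: int, heads: int, num_experts: int):
        super().__init__()
        assert dim % heads == 0, "dim must be divisible by heads"
        self.heads = heads
        self.head_dim = dim // heads
        # Gate scores each sub-token against every expert
        self.gate = nn.Linear(self.head_dim, num_experts)
        # Each expert is a small feed-forward network on sub-tokens
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(self.head_dim, 4 * self.head_dim),
                nn.GELU(),
                nn.Linear(4 * self.head_dim, self.head_dim),
            )
            for _ in range(num_experts)
        )
        # Merge layer recombines the sub-token outputs per token
        self.merge = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, n, d = x.shape
        # Split every token into `heads` sub-tokens of size head_dim
        sub = x.reshape(b * n * self.heads, self.head_dim)
        # Top-1 gating: each sub-token picks its highest-scoring expert
        weights = self.gate(sub).softmax(dim=-1)
        top_w, top_idx = weights.max(dim=-1)
        out = torch.zeros_like(sub)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i
            if mask.any():
                # Scale each expert's output by its gate weight
                out[mask] = top_w[mask].unsqueeze(-1) * expert(sub[mask])
        # Merge sub-tokens back into full-width tokens
        return self.merge(out.reshape(b, n, d))
```

Because routing happens at the sub-token level, different heads of the same token can activate different experts, which is what lets the layer draw on multiple representation spaces at once.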
Download files

Source Distribution:
- mh_moe-0.0.2.tar.gz (5.1 kB)

Built Distribution:
- mh_moe-0.0.2-py3-none-any.whl (4.9 kB)
File details

Details for the file mh_moe-0.0.2.tar.gz.

File metadata

- Download URL: mh_moe-0.0.2.tar.gz
- Upload date:
- Size: 5.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/23.3.0
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 2378d464f54c207ed129e57aa3b83ece6c3c7d675898a9bf26a1dc8d4c5afa94 |
| MD5 | 504c72e206e37dd53c307649e302bd5e |
| BLAKE2b-256 | c327b11a07721e0f2eedc06e955a5008b7261c52624016c41e74aaa0acb22a04 |
File details

Details for the file mh_moe-0.0.2-py3-none-any.whl.

File metadata

- Download URL: mh_moe-0.0.2-py3-none-any.whl
- Upload date:
- Size: 4.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/23.3.0
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | b984be79496acf7cd3ab503ecb5c06b84f456a3f5599adae7558ea2a24ae35e7 |
| MD5 | af965822c8e7695281e3f4d739af056b |
| BLAKE2b-256 | ef11326c2c8ebb4ee326644183a3ba19b0fbf69cafa69e848fc6ccf9be47dfcb |