
Paper - PyTorch

Project description


Multi-Head Mixture of Experts (MHMoE)

MH-MoE collectively attends to information from various representation spaces within different experts, deepening context understanding while significantly enhancing expert activation.
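At a high level, each token is projected and split into several sub-tokens (one per head), every sub-token is independently routed to an expert, and the expert outputs are merged back into a single token, so more experts are activated per token than in a standard MoE. The snippet below is a minimal sketch of that split-route-merge flow for illustration only, assuming a top-1 router and small feed-forward experts; the SimpleMHMoE class is not the API of this package (use MHMoE from mh_moe as shown under Usage).

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMHMoE(nn.Module):
    """Illustrative sketch of the multi-head MoE flow (not the packaged implementation)."""

    def __init__(self, dim, heads, num_experts):
        super().__init__()
        assert dim % heads == 0, "dim must be divisible by heads"
        self.heads = heads
        self.head_dim = dim // heads
        # Multi-head projection applied before routing
        self.head_proj = nn.Linear(dim, dim)
        # Router scores each sub-token against the experts
        self.router = nn.Linear(self.head_dim, num_experts)
        # Bank of small feed-forward experts operating on sub-tokens
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(self.head_dim, 4 * self.head_dim),
                nn.GELU(),
                nn.Linear(4 * self.head_dim, self.head_dim),
            )
            for _ in range(num_experts)
        )
        # Merge layer that recombines sub-tokens into full tokens
        self.merge_proj = nn.Linear(dim, dim)

    def forward(self, x):
        b, n, d = x.shape
        # Split every token into `heads` sub-tokens of size head_dim
        sub = self.head_proj(x).reshape(b, n * self.heads, self.head_dim)
        # Top-1 routing: each sub-token picks its highest-scoring expert
        gates = F.softmax(self.router(sub), dim=-1)
        top_gate, top_idx = gates.max(dim=-1)
        out = torch.zeros_like(sub)
        for i, expert in enumerate(self.experts):
            picked = top_idx == i
            if picked.any():
                out[picked] = expert(sub[picked]) * top_gate[picked].unsqueeze(-1)
        # Merge sub-tokens back to the model dimension
        return self.merge_proj(out.reshape(b, n, d))

# Shape check: the output matches the input
sketch = SimpleMHMoE(dim=512, heads=8, num_experts=4)
print(sketch(torch.rand(2, 16, 512)).shape)  # torch.Size([2, 16, 512])

A top-2 or load-balanced router could be substituted without changing the overall split-route-merge structure.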

Install

pip3 install mh-moe

Usage

import torch
from mh_moe.main import MHMoE

# Define model parameters
dim = 512
heads = 8
num_experts = 4
num_layers = 3

# Create MHMoE model instance
model = MHMoE(dim, heads, num_experts, num_layers)

# Generate dummy input
batch_size = 10
seq_length = 20
dummy_input = torch.rand(batch_size, seq_length, dim)
dummy_mask = torch.ones(batch_size, seq_length)  # Example mask

# Forward pass through the model
output = model(dummy_input, dummy_mask)

# Print output and its shape
print(output)
print(output.shape)
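
Assuming MHMoE behaves like a standard nn.Module, it can be trained like any other PyTorch layer. Below is a minimal sketch of a single optimization step reusing model and output from the snippet above; the MSE objective, random target, and AdamW hyperparameters are placeholders, not recommendations.

# Hedged sketch of one training step (placeholder objective)
target = torch.rand_like(output)                     # hypothetical regression target
loss = torch.nn.functional.mse_loss(output, target)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(loss.item())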

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mh_moe-0.0.2.tar.gz (5.1 kB)

Built Distribution

mh_moe-0.0.2-py3-none-any.whl (4.9 kB)
