Paper - PyTorch

Project description

Multi-Head Mixture of Experts (MH-MoE)

MH-MoE collectively attends to information from various representation spaces within different experts, deepening context understanding while significantly enhancing expert activation.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mh_moe-0.0.1.tar.gz (4.9 kB)

Uploaded: Source

Built Distribution

mh_moe-0.0.1-py3-none-any.whl (4.9 kB)

Uploaded: Python 3

File details

Details for the file mh_moe-0.0.1.tar.gz.

File metadata

  • Download URL: mh_moe-0.0.1.tar.gz
  • Size: 4.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/23.3.0

File hashes

Hashes for mh_moe-0.0.1.tar.gz

  • SHA256: b5917f905a4b22ca1d10aaaff0ecb69f55c3078a393ed4cba2aa995395801292
  • MD5: a2f26419ddc41fcf5772a5a5df9113f8
  • BLAKE2b-256: 6e97402647bd001a51fa1fc34b3f56893456b95a272f427f83c87793816b1544

See the PyPI documentation for more details on using hashes.

File details

Details for the file mh_moe-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: mh_moe-0.0.1-py3-none-any.whl
  • Size: 4.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/23.3.0

File hashes

Hashes for mh_moe-0.0.1-py3-none-any.whl

  • SHA256: 4fbbd669a88b911fddf9e85c45fcb803d93c9ea6ebfe3e45a13406b294ae5e05
  • MD5: b09f8563f17eeae6292933ebd08ebfed
  • BLAKE2b-256: 8c6c7ab63e7b4741a172d280025d2b8a94bd3fdcb57b6c01b998c18b83877e97

See the PyPI documentation for more details on using hashes.
