
Multi-head attention implemented in PyTorch

Project description

PyTorch Multi-Head Attention


Install

pip install torch-multi-head-attention

Usage

from torch_multi_head_attention import MultiHeadAttention

# 12 attention heads over a 768-dimensional input/output feature space
attention = MultiHeadAttention(in_features=768, head_num=12)
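
Assuming the layer follows the upstream repository's forward(q, k, v, mask=None) signature over tensors shaped (batch, seq_len, in_features), a minimal self-attention sketch (not part of the package's documented examples) would be:

import torch
from torch_multi_head_attention import MultiHeadAttention

attention = MultiHeadAttention(in_features=768, head_num=12)

# Self-attention: query, key, and value are the same tensor.
x = torch.rand(2, 16, 768)   # (batch=2, seq_len=16, features=768)
out = attention(x, x, x)     # assumed signature: forward(q, k, v, mask=None)
print(out.shape)             # torch.Size([2, 16, 768])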

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch-multi-head-attention-0.15.1.tar.gz (3.6 kB)

Uploaded Source

File details

Details for the file torch-multi-head-attention-0.15.1.tar.gz.

File metadata

  • Download URL: torch-multi-head-attention-0.15.1.tar.gz
  • Upload date:
  • Size: 3.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.20.1 setuptools/40.7.1 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.6.4

File hashes

Hashes for torch-multi-head-attention-0.15.1.tar.gz

  • SHA256: e181602fe1ef6da8322cb6bc1ffb41f52d3658c54e3937040e8f186754bb3056
  • MD5: e24c9e56e808eee69921d26768f8bcca
  • BLAKE2b-256: 8d73b0734654ec4c950270d32c3d4ffb7460e63df229021a52386bf86356e815

See more details on using hashes here.
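
To check a downloaded archive against the SHA256 digest above, a small verification sketch with Python's standard hashlib (the local file path is assumed to be wherever you saved the download) could be:

import hashlib

# Assumed local path to the downloaded source distribution
path = "torch-multi-head-attention-0.15.1.tar.gz"
expected = "e181602fe1ef6da8322cb6bc1ffb41f52d3658c54e3937040e8f186754bb3056"

# Hash the file contents and compare against the published digest.
with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")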
