
Multi-head attention implemented in PyTorch

Project description

PyTorch Multi-Head Attention
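Multi-head attention runs several scaled dot-product attention heads in parallel and concatenates their outputs, following the standard Transformer formulation (Vaswani et al., 2017):

Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
head_i = Attention(Q W_i^Q, K W_i^K, V W_i^V)
MultiHead(Q, K, V) = Concat(head_1, ..., head_h) W^O

With in_features=768 and head_num=12, each head attends in a 64-dimensional subspace (768 / 12).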



Install:

pip install torch-multi-head-attention


Usage:

from torch_multi_head_attention import MultiHeadAttention

# 768 input features split across 12 heads (64 dimensions per head).
attention = MultiHeadAttention(in_features=768, head_num=12)
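A minimal end-to-end sketch, assuming the layer's forward pass takes separate query, key, and value tensors of shape (batch, seq_len, in_features), as is typical for multi-head attention modules (check the project repository for the exact signature):

import torch
from torch_multi_head_attention import MultiHeadAttention

attention = MultiHeadAttention(in_features=768, head_num=12)

# Self-attention: query, key, and value are all the same tensor.
# Shapes are assumed here to be (batch, seq_len, in_features).
x = torch.rand(2, 10, 768)
output = attention(x, x, x)
print(output.shape)  # expected: torch.Size([2, 10, 768])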


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for torch-multi-head-attention, version 0.15.1
Filename: torch-multi-head-attention-0.15.1.tar.gz (3.6 kB)
File type: Source
Python version: None
