
Multi-head attention implemented in PyTorch

Project description

PyTorch Multi-Head Attention

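Multi-head attention (Vaswani et al., 2017) projects the input into several independent heads, applies scaled dot-product attention in each head, and concatenates the results. A minimal reference sketch of the per-head computation in plain PyTorch (an illustration of the technique, not this package's internal code):

import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_k); scores are scaled by sqrt(d_k)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)
    return weights @ v

With head_num=12 heads over in_features=768, each head operates on 768 / 12 = 64 dimensions.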

Install

pip install torch-multi-head-attention

Usage

from torch_multi_head_attention import MultiHeadAttention

MultiHeadAttention(in_features=768, head_num=12)
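The snippet above shows only the constructor. A minimal forward-pass sketch, assuming the module follows the usual query/key/value calling convention (the forward signature is not documented on this page, so treat the call below as an assumption):

import torch
from torch_multi_head_attention import MultiHeadAttention

attention = MultiHeadAttention(in_features=768, head_num=12)

# Hypothetical input: a batch of 2 sequences of length 10 with 768 features.
x = torch.rand(2, 10, 768)

# Self-attention: the same tensor serves as query, key, and value
# (assumed forward(q, k, v) signature).
y = attention(x, x, x)
print(y.shape)  # expected: torch.Size([2, 10, 768])

Note that in_features should be divisible by head_num (here, 768 / 12 = 64 features per head).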

Download files

Download the file for your platform.

Files for torch-multi-head-attention, version 0.15.1

Filename: torch-multi-head-attention-0.15.1.tar.gz (3.6 kB)
File type: Source
Python version: None
