
PyTorch Implementation of Self-attention Does Not Need $O(n^2)$ Memory


faster-transformer

A PyTorch implementation of Self-attention Does Not Need $O(n^2)$ Memory (Rabe & Staats, 2021):

@misc{rabe2021selfattention,
    title={Self-attention Does Not Need $O(n^2)$ Memory}, 
    author={Markus N. Rabe and Charles Staats},
    year={2021},
    eprint={2112.05682},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
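The package's own API is not documented here, but the paper's core idea is to avoid materializing the full $n \times n$ attention matrix by processing keys and values in chunks while maintaining a running softmax (numerator, denominator, and running max of the logits). The following is a minimal sketch of that technique in plain PyTorch; the function name, single-head $(n, d)$ tensor shapes, and chunk_size are illustrative assumptions, not this package's actual interface.

    import torch

    def memory_efficient_attention(q, k, v, chunk_size=1024):
        """Compute softmax(q @ k.T / sqrt(d)) @ v without building the
        full (n, n) attention matrix, by iterating over key/value
        chunks with a numerically stable running softmax."""
        n, d = q.shape
        q = q * d ** -0.5

        # Running accumulators per query position: weighted values,
        # softmax denominator, and running max of the logits.
        acc = torch.zeros_like(q)                                    # (n, d)
        denom = torch.zeros(n, 1, dtype=q.dtype, device=q.device)
        running_max = torch.full((n, 1), float("-inf"),
                                 dtype=q.dtype, device=q.device)

        for start in range(0, k.shape[0], chunk_size):
            k_chunk = k[start:start + chunk_size]                    # (c, d)
            v_chunk = v[start:start + chunk_size]                    # (c, d)

            scores = q @ k_chunk.T                                   # (n, c)
            chunk_max = scores.max(dim=-1, keepdim=True).values
            new_max = torch.maximum(running_max, chunk_max)

            # Rescale earlier accumulators to the new max, then fold
            # in this chunk's contribution.
            correction = torch.exp(running_max - new_max)
            weights = torch.exp(scores - new_max)                    # (n, c)

            acc = acc * correction + weights @ v_chunk
            denom = denom * correction + weights.sum(dim=-1, keepdim=True)
            running_max = new_max

        return acc / denom

Up to floating-point error, the result should match the standard quadratic-memory computation, torch.softmax(q @ k.T * d ** -0.5, dim=-1) @ v, while peak memory for the attention weights scales with the chunk size rather than the sequence length.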
