
PyTorch implementation of popular attention mechanisms, Vision Transformers, MLP-like models, and CNNs.

Project description


This codebase provides PyTorch implementations of various attention mechanisms, CNNs, Vision Transformers, and MLP-like models.

If it is helpful for your work, please give the repository a ⭐.

The collection below is still being updated.

Attention mechanisms

  • Squeeze-and-Excitation Networks (CVPR 2018) pdf (see the sketch after this list)
  • CBAM: Convolutional Block Attention Module (ECCV 2018) pdf
  • BAM: Bottleneck Attention Module (BMVC 2018) pdf
  • A2-Nets: Double Attention Networks (NeurIPS 2018) pdf
  • SRM: A Style-based Recalibration Module for Convolutional Neural Networks (ICCV 2019) pdf
  • GCNet: Non-local Networks Meet Squeeze-Excitation Networks and Beyond (ICCVW 2019) pdf
  • Selective Kernel Networks (CVPR 2019) pdf
  • Linear Context Transform Block (AAAI 2020) pdf
  • Gated Channel Transformation for Visual Recognition (CVPR 2020) pdf
  • ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks (CVPR 2020) pdf
  • Rotate to Attend: Convolutional Triplet Attention Module (WACV 2021) pdf
  • Gaussian Context Transformer (CVPR 2021) pdf
  • Coordinate Attention for Efficient Mobile Network Design (CVPR 2021) pdf
  • SimAM: A Simple, Parameter-Free Attention Module for Convolutional Neural Networks (ICML 2021) pdf
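
As a taste of what these modules look like in PyTorch, here is a minimal sketch of the first entry above, the Squeeze-and-Excitation block. The class name, reduction ratio, and API are illustrative assumptions, not necessarily this package's actual interface:

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation block (CVPR 2018), illustrative sketch.

    Globally average-pools each channel ("squeeze"), runs the pooled
    vector through a two-layer bottleneck MLP ("excitation"), and
    rescales the input channels with the resulting sigmoid gates.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = x.mean(dim=(2, 3))           # squeeze: (B, C)
        w = self.fc(w).view(b, c, 1, 1)  # excitation: per-channel gates
        return x * w                     # recalibrate the feature map

x = torch.randn(2, 64, 32, 32)
print(SEBlock(64)(x).shape)  # torch.Size([2, 64, 32, 32])
```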

Vision Transformers

  • An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale (ICLR 2021) pdf (see the sketch after this list)
  • XCiT: Cross-Covariance Image Transformers (NeurIPS 2021) pdf
  • Rethinking Spatial Dimensions of Vision Transformers (ICCV 2021) pdf
  • CvT: Introducing Convolutions to Vision Transformers (ICCV 2021) pdf
  • CMT: Convolutional Neural Networks Meet Vision Transformers (CVPR 2022) pdf
  • MetaFormer is Actually What You Need for Vision (CVPR 2022) pdf
  • MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer (ICLR 2022) pdf
  • DilateFormer: Multi-Scale Dilated Transformer for Visual Recognition (TMM 2023) pdf
  • BViT: Broad Attention based Vision Transformer (TNNLS 2023) pdf
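
The first entry above (ViT) boils down to a patch embedding followed by a stock Transformer encoder. A minimal sketch follows, with illustrative names and hyperparameters rather than this package's actual API:

```python
import torch
import torch.nn as nn

class TinyViT(nn.Module):
    """Minimal ViT-style classifier (ICLR 2021), illustrative sketch.

    Splits the image into non-overlapping patches with a strided conv,
    prepends a learnable [CLS] token, adds positional embeddings, and
    runs a standard pre-norm Transformer encoder.
    """

    def __init__(self, img_size=32, patch=4, dim=128, depth=2, heads=4, num_classes=10):
        super().__init__()
        num_patches = (img_size // patch) ** 2
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches + 1, dim))
        layer = nn.TransformerEncoderLayer(
            dim, heads, dim * 4, batch_first=True, norm_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, depth)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):
        x = self.patch_embed(x).flatten(2).transpose(1, 2)  # (B, N, dim)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        x = torch.cat([cls, x], dim=1) + self.pos_embed
        x = self.encoder(x)
        return self.head(x[:, 0])  # classify from the [CLS] token

print(TinyViT()(torch.randn(2, 3, 32, 32)).shape)  # torch.Size([2, 10])
```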

Convolutional Neural Networks (CNNs)

  • Network In Network (ICLR 2014) pdf
  • Deep Residual Learning for Image Recognition (CVPR 2016) pdf (see the sketch after this list)
  • Wide Residual Networks (BMVC 2016) pdf
  • Densely Connected Convolutional Networks (CVPR 2017) pdf
  • Deep Pyramidal Residual Networks (CVPR 2017) pdf
  • MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications (arXiv 2017) pdf
  • MobileNetV2: Inverted Residuals and Linear Bottlenecks (CVPR 2018) pdf
  • Searching for MobileNetV3 (ICCV 2019) pdf
  • Res2Net: A New Multi-scale Backbone Architecture (TPAMI 2019) pdf
  • GhostNet: More Features from Cheap Operations (CVPR 2020) pdf
  • A ConvNet for the 2020s (CVPR 2022) pdf
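
The residual block behind the ResNet entry above is compact enough to sketch in full; the class name and layer sizes here are illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BasicBlock(nn.Module):
    """ResNet basic block (CVPR 2016): out = relu(F(x) + shortcut(x))."""

    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, 1, 1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        # 1x1 projection when x and F(x) differ in shape
        self.shortcut = nn.Sequential()
        if stride != 1 or in_ch != out_ch:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride, bias=False),
                nn.BatchNorm2d(out_ch),
            )

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + self.shortcut(x))

print(BasicBlock(64, 128, stride=2)(torch.randn(2, 64, 32, 32)).shape)
# torch.Size([2, 128, 16, 16])
```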

MLP-Like Models

  • MLP-Mixer: An all-MLP Architecture for Vision (NeurIPS 2021) pdf (see the sketch after this list)
  • Pay Attention to MLPs (NeurIPS 2021) pdf
  • Global Filter Networks for Image Classification (NeurIPS 2021) pdf
  • Sparse MLP for Image Recognition: Is Self-Attention Really Necessary? (AAAI 2022) pdf
  • DynaMixer: A Vision MLP Architecture with Dynamic Mixing (ICML 2022) pdf
  • Patches Are All You Need? (TMLR 2022) pdf
  • Vision Permutator: A Permutable MLP-Like Architecture for Visual Recognition (TPAMI 2022) pdf
  • CycleMLP: A MLP-like Architecture for Dense Prediction (ICLR 2022) pdf
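
The MLP-Mixer entry above illustrates the pattern shared by this whole family: alternate mixing across tokens and across channels using plain MLPs, with no self-attention. A minimal sketch (names and hidden sizes are illustrative):

```python
import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    """One MLP-Mixer block (NeurIPS 2021), illustrative sketch.

    A token-mixing MLP (applied across patches) followed by a
    channel-mixing MLP (applied across features), each with a
    residual connection and pre-LayerNorm.
    """

    def __init__(self, num_tokens, dim, token_hidden=256, channel_hidden=512):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.token_mlp = nn.Sequential(
            nn.Linear(num_tokens, token_hidden),
            nn.GELU(),
            nn.Linear(token_hidden, num_tokens),
        )
        self.norm2 = nn.LayerNorm(dim)
        self.channel_mlp = nn.Sequential(
            nn.Linear(dim, channel_hidden),
            nn.GELU(),
            nn.Linear(channel_hidden, dim),
        )

    def forward(self, x):                           # x: (B, tokens, dim)
        y = self.norm1(x).transpose(1, 2)           # (B, dim, tokens)
        x = x + self.token_mlp(y).transpose(1, 2)   # mix across tokens
        return x + self.channel_mlp(self.norm2(x))  # mix across channels

print(MixerBlock(num_tokens=64, dim=128)(torch.randn(2, 64, 128)).shape)
# torch.Size([2, 64, 128])
```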


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pytorch-attention-1.0.0.tar.gz (3.3 kB)


File details

Details for the file pytorch-attention-1.0.0.tar.gz.

File metadata

  • Download URL: pytorch-attention-1.0.0.tar.gz
  • Upload date:
  • Size: 3.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.7

File hashes

Hashes for pytorch-attention-1.0.0.tar.gz

  • SHA256: cf74e56fbe57c0e93f84a61819c670395cb4f7f025fb4142b9ef9ba765417d86
  • MD5: 2d8bde10ae5f63017bae85e2a40543bb
  • BLAKE2b-256: 5bc7a20d2f16a97925a1951571bda607ac5fb9cd4fbbed01a47a8a498629e803

See more details on using hashes here.
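
To check a downloaded sdist against the SHA256 digest listed above, here is a small standard-library sketch (the local file path is an assumption about where you saved the file):

```python
import hashlib

# SHA256 digest listed above for pytorch-attention-1.0.0.tar.gz
EXPECTED = "cf74e56fbe57c0e93f84a61819c670395cb4f7f025fb4142b9ef9ba765417d86"

# Assumes the sdist was saved to the current directory under its PyPI name.
with open("pytorch-attention-1.0.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED else f"MISMATCH: {digest}")
```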
