CNN attention layers for use with tf or tf.keras
Project description
Visual_attention_tf
A set of image attention layers implemented as custom Keras layers that can be imported directly into Keras models.
Currently implemented layers:
- Pixel Attention : Efficient Image Super-Resolution Using Pixel Attention (Hengyuan Zhao et al.)
- Channel Attention : CBAM: Convolutional Block Attention Module (Sanghyun Woo et al.)
- Efficient Channel Attention : ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks (Qilong Wang et al.)
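For intuition, the channel-attention computation described in CBAM can be sketched in plain NumPy. This is an illustrative sketch, not the package's implementation: the weight matrices `w1`/`w2` and the reduction ratio `r` are hypothetical stand-ins for the layer's learned parameters.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(z, 0.0)

def channel_attention(x, w1, w2):
    """CBAM-style channel attention on a single (H, W, C) feature map."""
    # Squeeze spatial information with average- and max-pooling -> two (C,) vectors
    avg_pool = x.mean(axis=(0, 1))
    max_pool = x.max(axis=(0, 1))
    # Shared two-layer MLP with a channel-reduction bottleneck, then sigmoid
    att = sigmoid(relu(avg_pool @ w1) @ w2 + relu(max_pool @ w1) @ w2)
    # Broadcast the (C,) attention weights over every spatial position
    return x * att

rng = np.random.default_rng(0)
H, W, C, r = 4, 4, 8, 2          # r is the bottleneck reduction ratio (hypothetical)
x = rng.normal(size=(H, W, C))
w1 = rng.normal(size=(C, C // r))
w2 = rng.normal(size=(C // r, C))
y = channel_attention(x, w1, w2)
print(y.shape)  # (4, 4, 8)
```

Because the sigmoid output lies in (0, 1), each channel of the input is scaled down by its learned importance; the output keeps the input's shape, so the layer can be dropped between any two convolutions.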
Installation
You can see the project's official PyPI page: https://pypi.org/project/visual-attention-tf/

```shell
pip install visual-attention-tf
```

Use `--no-deps` if you already have tensorflow-gpu installed.
Usage:
```python
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D
from visual_attention import PixelAttention2D, ChannelAttention2D, EfficientChannelAttention2D

inp = Input(shape=(1920, 1080, 3))
cnn_layer = Conv2D(32, 3, activation='relu', padding='same')(inp)

# Using .shape[-1] to simplify network modifications; the number of channels can also be passed directly
Pixel_attention_cnn = PixelAttention2D(cnn_layer.shape[-1])(cnn_layer)
Channel_attention_cnn = ChannelAttention2D(cnn_layer.shape[-1])(cnn_layer)
EfficientChannelAttention_cnn = EfficientChannelAttention2D(cnn_layer.shape[-1])(cnn_layer)
```
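The pixel-attention variant applied above differs from channel attention in that it produces a separate weight for every pixel and channel. A minimal NumPy sketch of that computation (illustrative only, not the package's implementation; the 1x1-convolution weights `w` are a hypothetical stand-in for the layer's learned kernel):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def pixel_attention(x, w):
    """Pixel attention on a single (H, W, C) feature map."""
    # A 1x1 convolution is a per-pixel matrix multiply over the channel axis
    att = sigmoid(x @ w)   # per-pixel, per-channel attention map, shape (H, W, C)
    # Reweight every pixel of every channel individually
    return x * att

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 4, 8))
w = rng.normal(size=(8, 8))    # weights of the hypothetical 1x1 conv
y = pixel_attention(x, w)
print(y.shape)  # (4, 4, 8)
```

As with the channel variant, the output shape matches the input, so the layer composes freely with surrounding `Conv2D` blocks.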
Project details
Download files
Source Distribution: visual-attention-tf-1.2.0.tar.gz
Built Distribution: visual_attention_tf-1.2.0-py3-none-any.whl
Hashes for visual-attention-tf-1.2.0.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 2595b764ec1dcbf39a8fc2b981e756c795b9b334fedaf07e34cf58350957da01 |
| MD5 | 26af7ae7cc4c7613474b664e5a72ce20 |
| BLAKE2b-256 | f940f5329f8fca302b6499f21a671b10fcd8f8f4e5a9f84913bc4de785ad840c |
Hashes for visual_attention_tf-1.2.0-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | e212a1170bcbb4216ced73c87dfefa1e6c72ab0205952ace59ccef23ba38db36 |
| MD5 | 71cb257214056b4c5b120c3a306fa174 |
| BLAKE2b-256 | 9c5d9f038133f7ae7a76a928a7a587f63b3e5823c0c5950725330c974dfc1a44 |