CNN attention layers to be used with TensorFlow or tf.keras
Project description
Visual_attention_tf
A set of image attention layers implemented as custom Keras layers that can be imported directly into Keras.
Currently implemented layers:
- Pixel Attention: "Efficient Image Super-Resolution Using Pixel Attention" (Hengyuan Zhao et al.)
- Channel Attention: "CBAM: Convolutional Block Attention Module" (Sanghyun Woo et al.)
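To install the latest release from PyPI (the distribution name matches the files listed under Download files below), run `pip install visual-attention-tf`.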
Usage:
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D, Conv2DTranspose, SeparableConv2D, Concatenate, Multiply, Add
# Import path and class names assumed from the package name and the layer
# list above; check the package source for the exact module and exports.
from visual_attention import PixelAttention2D, ChannelAttention2D
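A minimal end-to-end sketch follows. The constructor argument (the channel count of the incoming feature map) and the class names are assumptions inferred from the layer list above, not a confirmed API; consult the package source before relying on them.

```python
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D

# Assumed import path and class names; verify against the package source.
from visual_attention import PixelAttention2D, ChannelAttention2D

inp = Input(shape=(64, 64, 3))
x = Conv2D(32, 3, activation="relu", padding="same")(inp)

# Pixel attention (Zhao et al.): learns an H x W x C attention map and
# multiplies it element-wise with the incoming feature map.
x = PixelAttention2D(x.shape[-1])(x)

# Channel attention (Woo et al., CBAM): re-weights each feature channel
# using pooled spatial statistics.
x = ChannelAttention2D(x.shape[-1])(x)

out = Conv2D(3, 1, padding="same")(x)
model = Model(inp, out)
model.summary()
```

Both layers preserve the shape of their input, so they can be dropped between any two convolutional blocks of an existing model.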
Download files
- Source distribution: `visual-attention-tf-1.0.1.tar.gz`
- Built distribution: `visual_attention_tf-1.0.1-py3-none-any.whl`
Hashes for visual-attention-tf-1.0.1.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 8a7faa2195664788c010a8464ab0623eaa932b056afffdcbd5b7282dae2cd24b
MD5 | 16a9e1a67186428d1d4494d5d5042e4d
BLAKE2b-256 | 0a52f998f9cf698cc067eca81e19609c2095b82d26cb441704b8e892fba3cde8
Hashes for visual_attention_tf-1.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 9840725074f4b0064aaaafc65cc6e2df9e8adf7d997e971d75d60a3e701bd847
MD5 | 85b3f02a12a1b4847d51908e53c6aa8e
BLAKE2b-256 | 589b21442857668688c533586a3ca458b38a7ae5e98e0c8fd69ad650002cb470
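To check a downloaded file against the hashes above, a short sketch (the wheel filename is taken from the table heading; the same pattern works for the source tarball):

```python
import hashlib

# SHA256 for the wheel, copied from the table above.
EXPECTED_SHA256 = "9840725074f4b0064aaaafc65cc6e2df9e8adf7d997e971d75d60a3e701bd847"

with open("visual_attention_tf-1.0.1-py3-none-any.whl", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED_SHA256, "SHA256 mismatch: the download may be corrupted."
print("SHA256 verified")
```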