CNN attention layers to be used with tf or tf.keras
Reason this release was yanked:
Import pathing issues
Project description
Visual_attention_tf
A set of image attention mechanisms implemented as custom Keras layers that can be imported directly into Keras models.
Currently implemented layers:
- Pixel Attention: Efficient Image Super-Resolution Using Pixel Attention (Hengyuan Zhao et al.)
- Channel Attention: CBAM: Convolutional Block Attention Module (Sanghyun Woo et al.)
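In rough terms, both mechanisms compute a sigmoid gate and multiply it into the feature map: pixel attention gates every spatial position and channel via a 1x1 convolution, while CBAM's channel attention gates whole channels using pooled descriptors passed through a shared MLP. A minimal NumPy sketch of the underlying math (the function names and weight shapes here are illustrative, not this package's API):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pixel_attention(x, w):
    """Pixel attention (Zhao et al.): a 1x1 conv (here a per-pixel
    matmul with w of shape (C, C)) plus a sigmoid yields a gate the
    same shape as x, which rescales every pixel and channel."""
    gate = sigmoid(x @ w)          # (H, W, C) attention map in (0, 1)
    return x * gate

def channel_attention(x, w1, w2):
    """CBAM channel attention (Woo et al.): average- and max-pooled
    channel descriptors pass through a shared two-layer bottleneck
    MLP; their sum is squashed into a per-channel gate."""
    avg = x.mean(axis=(0, 1))                     # (C,) avg-pooled descriptor
    mx = x.max(axis=(0, 1))                       # (C,) max-pooled descriptor
    mlp = lambda v: np.maximum(v @ w1, 0.0) @ w2  # shared ReLU bottleneck MLP
    gate = sigmoid(mlp(avg) + mlp(mx))            # (C,) channel weights in (0, 1)
    return x * gate                               # broadcast over H and W
```

Because both gates lie in (0, 1), each output entry can only shrink the corresponding input entry in magnitude.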
Usage:
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D
from visual_attention import PixelAttention2D, ChannelAttention2D  # import path assumed; this release was yanked for import pathing issues
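Since this release's own imports are broken, here is a self-contained sketch of how such a layer plugs into a tf.keras model. The `PixelAttention` class below is a stand-in written for illustration, not the package's implementation:

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer, Conv2D, Input
from tensorflow.keras.models import Model

class PixelAttention(Layer):
    """Illustrative pixel-attention layer: a 1x1 conv with sigmoid
    activation produces a per-pixel, per-channel gate that rescales
    the incoming feature map."""
    def __init__(self, nf, **kwargs):
        super().__init__(**kwargs)
        self.conv = Conv2D(nf, 1, activation='sigmoid')

    def call(self, x):
        return x * self.conv(x)

inp = Input(shape=(64, 64, 3))
feat = Conv2D(32, 3, padding='same', activation='relu')(inp)
out = PixelAttention(feat.shape[-1])(feat)  # gate matches feature channels
model = Model(inp, out)
```

Passing `feat.shape[-1]` mirrors the pattern of sizing the attention gate to the preceding layer's channel count.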
Download files
Hashes for visual-attention-tf-1.0.2.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 59e477c5f1cfad1f73ca949e9b2f399e0d0cd619d66c896e7d1bb280987abdd6
MD5 | f5de7ded2439d3f7b230616c72ea3063
BLAKE2b-256 | b1aaff6fd3b6a07c1fc7c8b89d31d6be688bc24e0217b6293eeb067419b02c51
Hashes for visual_attention_tf-1.0.2-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 59f60a0ba18c7fa0b6c7bbee9a491f9f10569c1556f0743c1a1f91024505aafd
MD5 | 9cd0d5353bc77db247011cd2be94d14e
BLAKE2b-256 | 4b2b5b7ac013d99174a9f025a22cca757fef280d03c2886f40f0522a64a1a3c6