
CNN attention layers to be used with TensorFlow or tf.keras

Project description

Visual_attention_tf


A set of image attention layers implemented as custom Keras layers that can be imported directly into Keras.

Currently implemented layers:

  • PixelAttention2D
  • ChannelAttention2D
  • EfficientChannelAttention2D

Installation

You can see the project's official PyPI page: https://pypi.org/project/visual-attention-tf/

pip install visual-attention-tf

Use --no-dependencies if you already have tensorflow-gpu installed.

Usage:

from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D
from visual_attention import PixelAttention2D, ChannelAttention2D, EfficientChannelAttention2D

inp = Input(shape=(1920, 1080, 3))
cnn_layer = Conv2D(32, 3, activation='relu', padding='same')(inp)

# Using .shape[-1] so the channel count tracks network modifications; a literal number of channels works as well
Pixel_attention_cnn = PixelAttention2D(cnn_layer.shape[-1])(cnn_layer)
Channel_attention_cnn = ChannelAttention2D(cnn_layer.shape[-1])(cnn_layer)
EfficientChannelAttention_cnn = EfficientChannelAttention2D(cnn_layer.shape[-1])(cnn_layer)
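The channel-attention layers above follow the familiar squeeze-excite-scale pattern: pool each channel to a single value, pass the pooled values through a learned gate, and rescale the channels by the resulting weights. The pure-Python sketch below only illustrates that pattern; the actual ChannelAttention2D layer learns its excitation with trainable Keras sublayers, and the `weights` argument here is a hypothetical stand-in for those learned parameters.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def channel_attention(feature_map, weights):
    """Sketch of squeeze-excite-scale channel attention.

    feature_map: list of channels, each a 2D list (H x W) of floats.
    weights: one hypothetical per-channel gating weight (stand-in for
             the learned excitation network in the real layer).
    """
    # 1. Squeeze: global average pool each channel to a scalar
    pooled = [sum(sum(row) for row in ch) / (len(ch) * len(ch[0]))
              for ch in feature_map]
    # 2. Excite: map pooled values to gates in (0, 1)
    gates = [sigmoid(w * p) for w, p in zip(weights, pooled)]
    # 3. Scale: reweight every pixel of a channel by its gate
    return [[[v * g for v in row] for row in ch]
            for ch, g in zip(feature_map, gates)]
```

With a strongly positive gating weight the channel passes through almost unchanged; with a strongly negative one it is suppressed toward zero, which is the mechanism the learned layer exploits to emphasize informative channels.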

Project details


Download files

Download the file for your platform.

Source Distribution

visual-attention-tf-1.2.0.tar.gz (3.3 kB)

Uploaded Source

Built Distribution

visual_attention_tf-1.2.0-py3-none-any.whl (5.4 kB)

Uploaded Python 3

File details

Details for the file visual-attention-tf-1.2.0.tar.gz.

File metadata

  • Download URL: visual-attention-tf-1.2.0.tar.gz
  • Upload date:
  • Size: 3.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.6.12

File hashes

Hashes for visual-attention-tf-1.2.0.tar.gz
Algorithm Hash digest
SHA256 2595b764ec1dcbf39a8fc2b981e756c795b9b334fedaf07e34cf58350957da01
MD5 26af7ae7cc4c7613474b664e5a72ce20
BLAKE2b-256 f940f5329f8fca302b6499f21a671b10fcd8f8f4e5a9f84913bc4de785ad840c


File details

Details for the file visual_attention_tf-1.2.0-py3-none-any.whl.

File metadata

  • Download URL: visual_attention_tf-1.2.0-py3-none-any.whl
  • Upload date:
  • Size: 5.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.6.12

File hashes

Hashes for visual_attention_tf-1.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 e212a1170bcbb4216ced73c87dfefa1e6c72ab0205952ace59ccef23ba38db36
MD5 71cb257214056b4c5b120c3a306fa174
BLAKE2b-256 9c5d9f038133f7ae7a76a928a7a587f63b3e5823c0c5950725330c974dfc1a44

