CNN Attention layer to be used with tf or tf.keras

Project description

Visual_attention_tf

A set of image attention layers implemented as custom Keras layers that can be imported directly into Keras.

Currently implemented layers:

  • Pixel Attention 2D (PixelAttention2D)
  • Channel Attention 2D (ChannelAttention2D)

Installation

The project's official PyPI page: https://pypi.org/project/visual-attention-tf/

pip install visual-attention-tf

Use --no-deps if you already have tensorflow-gpu installed, so pip does not try to reinstall TensorFlow as a dependency.
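
For example, assuming tensorflow-gpu is already present in the environment:

pip install --no-deps visual-attention-tf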

Usage:

from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D
from visual_attention import PixelAttention2D, ChannelAttention2D

inp = Input(shape=(1920,1080,3))
cnn_layer = Conv2D(32, 3, activation='relu', padding='same')(inp)

# Using cnn_layer.shape[-1] keeps the attention layer's channel count in sync with the Conv2D output; the number of channels can also be passed in directly
Pixel_attention_cnn = PixelAttention2D(cnn_layer.shape[-1])(cnn_layer)
Channel_attention_cnn = ChannelAttention2D(cnn_layer.shape[-1])(cnn_layer)
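
A minimal sketch of turning the snippet above into a trainable model. The pooling and Dense head below are illustrative assumptions, not part of visual-attention-tf:

from tensorflow.keras.layers import GlobalAveragePooling2D, Dense

# Attach a small classification head to the attention output (head is illustrative only)
x = GlobalAveragePooling2D()(Pixel_attention_cnn)
out = Dense(10, activation='softmax')(x)

model = Model(inputs=inp, outputs=out)
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()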

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

visual-attention-tf-1.1.0.tar.gz (3.1 kB)

Uploaded Source

Built Distribution

visual_attention_tf-1.1.0-py3-none-any.whl (5.1 kB)

Uploaded Python 3

File details

Details for the file visual-attention-tf-1.1.0.tar.gz.

File metadata

  • Download URL: visual-attention-tf-1.1.0.tar.gz
  • Upload date:
  • Size: 3.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.6.12

File hashes

Hashes for visual-attention-tf-1.1.0.tar.gz
Algorithm Hash digest
SHA256 8fb0ccdb793f22d24ce06f844aa84d89a25e10ce1f11963f3dd0ee43493e96d8
MD5 80c2d517b348a940fca06315671220d5
BLAKE2b-256 feccf698555ed263470c497711168be17568916ec9c274e6df3496e27addd7a0

See the PyPI documentation for more details on using hashes.
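
As a quick sketch, the published SHA256 digest can be checked against a downloaded copy with Python's hashlib (the file path below assumes the archive sits in the current directory):

import hashlib

# Compare the digest of the downloaded sdist with the value published above
expected = "8fb0ccdb793f22d24ce06f844aa84d89a25e10ce1f11963f3dd0ee43493e96d8"
with open("visual-attention-tf-1.1.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "hash mismatch")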

File details

Details for the file visual_attention_tf-1.1.0-py3-none-any.whl.

File metadata

  • Download URL: visual_attention_tf-1.1.0-py3-none-any.whl
  • Upload date:
  • Size: 5.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.6.12

File hashes

Hashes for visual_attention_tf-1.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 783a90a1c07255a4f843cd717bbbcce59899364254a442d91e48dc1b57e8f2ad
MD5 09bab381f7f704510be2bbe278084d7b
BLAKE2b-256 79daeddf4391344a368d7158164901eacd33dc7b2a58dc8bd36349fa78f4fa9d

See the PyPI documentation for more details on using hashes.
