CNN attention layers to be used with TensorFlow or tf.keras

Project description

Visual_attention_tf

A set of image attention layers implemented as custom Keras layers that can be imported directly into Keras.

Currently implemented layers:

  • PixelAttention2D
  • ChannelAttention2D

Installation

You can find the project's official PyPI page at https://pypi.org/project/visual-attention-tf/

pip install visual-attention-tf
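
After installing, a quick import check confirms the package is available (a minimal sketch; the layer names are the ones used in the usage example below):

python -c "from visual_attention import PixelAttention2D, ChannelAttention2D; print('ok')"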

Usage:

from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D
from visual_attention import PixelAttention2D, ChannelAttention2D

inp = Input(shape=(1920,1080,3))
cnn_layer = Conv2D(32, 3, activation='relu', padding='same')(inp)

# Using .shape[-1] to simplify network modifications; the number of channels can also be passed directly
Pixel_attention_cnn = PixelAttention2D(cnn_layer.shape[-1])(cnn_layer)
Channel_attention_cnn = ChannelAttention2D(cnn_layer.shape[-1])(cnn_layer)
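
As a quick sanity check (not part of the package docs), the attended feature maps can be wrapped in a Model to inspect the output shapes; the variable name model below is just illustrative:

# Build a model exposing both attention outputs and print its summary
model = Model(inputs=inp, outputs=[Pixel_attention_cnn, Channel_attention_cnn])
model.summary()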

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

visual-attention-tf-1.0.4.tar.gz (3.0 kB)

Uploaded Source

Built Distribution

visual_attention_tf-1.0.4-py3-none-any.whl (5.1 kB)

Uploaded Python 3

File details

Details for the file visual-attention-tf-1.0.4.tar.gz.

File metadata

  • Download URL: visual-attention-tf-1.0.4.tar.gz
  • Upload date:
  • Size: 3.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.6.12

File hashes

Hashes for visual-attention-tf-1.0.4.tar.gz
Algorithm Hash digest
SHA256 09ca099e8f9541ae5cd0dd0c49aa01e2f7e5343e31a077dad73aae99850448ad
MD5 8d67ce9cac2e692418230e79633d7ac6
BLAKE2b-256 f6adf7366017603e0b5514c6b20f3ae8c82912a455b2ecd7c9f052d6874585e1

See more details on using hashes here.
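
For example, a downloaded sdist can be checked against the SHA256 digest above with Python's hashlib (a minimal sketch; the filename assumes the default download name):

import hashlib

# Read the downloaded archive and compute its SHA256 digest
with open("visual-attention-tf-1.0.4.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Compare against the published digest; prints True if the file is intact
print(digest == "09ca099e8f9541ae5cd0dd0c49aa01e2f7e5343e31a077dad73aae99850448ad")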

File details

Details for the file visual_attention_tf-1.0.4-py3-none-any.whl.

File metadata

  • Download URL: visual_attention_tf-1.0.4-py3-none-any.whl
  • Upload date:
  • Size: 5.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.6.12

File hashes

Hashes for visual_attention_tf-1.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 90573eb69e2c289065d601ba1d4259032a4205d6f8d752e62a7e68a47aef7033
MD5 fdb49aada7860a12d1261dea3514b000
BLAKE2b-256 1559f8f759e214b6b0dc50a52ee49bd41220243bcade809d72c75ce2f734710e

See more details on using hashes here.
