A Graph Attention Framework for extracting Graph Attention embeddings and implementing Multihead Graph Attention Networks
Project description
This package extracts Graph Attention embeddings and provides a TensorFlow Graph Attention Layer that can be used for knowledge-graph and node-based semantic tasks. It computes the pairwise embedding matrix for a higher-order node representation and concatenates the pairs under an attention weight. The result is passed through a LeakyReLU activation for importance sampling, which damps out the negative effect of a node, and a softmax layer then normalizes the attention results to produce the final output scores. The GraphAttentionBase.py script implements a TensorFlow/Keras layer for the GAT, and GraphMultiheadAttention.py is used to extract the GAT embeddings.
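The attention computation described above (pairwise concatenation scored by an attention vector, LeakyReLU, masked softmax) can be sketched in plain NumPy for a single head. This is an illustrative sketch, not the package's actual API; the function name `gat_attention` and all parameter names are assumptions made for this example.

```python
import numpy as np

def gat_attention(h, W, a, adj, alpha=0.2):
    """Single-head graph-attention sketch (illustrative, not the package API).

    h:   (N, F)   node features
    W:   (F, F')  shared weight matrix
    a:   (2*F',)  attention vector
    adj: (N, N)   0/1 adjacency matrix
    """
    z = h @ W                       # higher-order node representation
    N = z.shape[0]
    # Pairwise embedding matrix: score each concatenated pair [z_i || z_j]
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            e[i, j] = np.concatenate([z[i], z[j]]) @ a
    # LeakyReLU damps the contribution of negatively scored pairs
    e = np.where(e > 0, e, alpha * e)
    # Mask non-neighbours, then softmax-normalize each row of scores
    e = np.where(adj > 0, e, -1e9)
    e = e - e.max(axis=1, keepdims=True)
    att = np.exp(e) / np.exp(e).sum(axis=1, keepdims=True)
    return att @ z                  # final output scores
```

A multi-head variant would run this computation K times with independent `W` and `a` and concatenate (or average) the K outputs.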
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
File details
Details for the file GraphAttentionNetworks-0.1.tar.gz.
File metadata
- Download URL: GraphAttentionNetworks-0.1.tar.gz
- Upload date:
- Size: 4.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.5.0.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.62.2 CPython/3.8.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | fc349b7e036c984523a3dc77c4222c0287825e9aadf5b8dce5279cfac336dc7b
MD5 | 868dd3850c5a846acc5c0e5ff49515c7
BLAKE2b-256 | eceb1fa7998ac6d0cc21c555771a9acf1ea2f1db06fcb6b351d8b9be2585f404