
General Base Layers for Graph Convolutions with tensorflow.keras


Keras Graph Convolutions

A set of layers for graph convolutions in TensorFlow Keras that use RaggedTensors.


General

The kgcnn package contains several layer classes to build up graph convolution models. Some models are provided as examples. Documentation is generated in docs. This repo is still under construction; any comments, suggestions or help are very welcome!

Requirements

For kgcnn, usually the latest version of tensorflow is required, but it is listed as an extra requirement in the setup.py for simplicity. Additional python packages are listed in the setup.py requirements and are installed automatically.

  • tensorflow>=2.4.1

Installation

Clone repository https://github.com/aimat-lab/gcnn_keras and install with editable mode:

pip install -e ./gcnn_keras

or install the latest release via the Python Package Index:

pip install kgcnn

Documentation

Auto-documentation is generated at https://kgcnn.readthedocs.io/en/latest/index.html .

Implementation details

Representation

The most frequent use cases for graph convolutions are node or graph classification. Regarding size, either a single large graph, e.g. a citation network, or many small (batched) graphs such as molecules have to be considered. A graph can be represented by an index list of connections plus feature information. Typical quantities in tensor format used to describe a graph are listed below.

  • nodes: Node list of shape (batch, N, F), where N is the number of nodes and F is the node feature dimension.
  • edges: Edge list of shape (batch, M, F), where M is the number of edges and F is the edge feature dimension.
  • indices: Connection list of shape (batch, M, 2), where M is the number of edges. Each entry denotes a connection of incoming node i and outgoing node j as (i, j).
  • state: Graph state information of shape (batch, F), where F denotes the state feature dimension.

A major issue for graphs is their flexible size and shape when using mini-batches. Here, for a graph implementation in the spirit of Keras, the batch dimension should also be kept between layers. This is realized by using RaggedTensor.

Note: At the moment, most layers also support a disjoint representation of flattened values plus a graph-id tensor [values, partition] in place of the RaggedTensor, for comparison purposes. However, this will likely be removed in future versions, as RaggedTensor is intended to be the only tensor representation passed to and within the model.
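
To illustrate the relation between the two representations, here is a minimal sketch with made-up numbers for a batch of two graphs, using only standard TensorFlow calls:

import tensorflow as tf

# Two graphs with 3 and 2 nodes and F=2 node features each (made-up values).
nodes = tf.ragged.constant(
    [[[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]],   # graph 0: 3 nodes
     [[1.0, 1.0], [0.0, 0.0]]],              # graph 1: 2 nodes
    ragged_rank=1, inner_shape=(2,))
print(nodes.shape)  # (2, None, 2)

# Equivalent disjoint form: flat node values plus a partition tensor.
values = nodes.values              # shape (5, 2), all nodes of the batch stacked
row_lengths = nodes.row_lengths()  # [3, 2], number of nodes per graph

# Rebuilding the ragged tensor from the disjoint form:
nodes_again = tf.RaggedTensor.from_row_lengths(values, row_lengths)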

Input

In order to input batched tensors of variable length with keras, either zero-padding plus masking or ragged and sparse tensors can be used. Moreover, for more flexibility, a data loader based on tf.keras.utils.Sequence is often used to input disjoint graph representations. Tools for converting numpy or scipy arrays are found in utils.

Here, for ragged tensors, the node list of shape (batch, None, F) and edge list of shape (batch, None, Fe) have one ragged dimension (None,). The graph structure is represented by an index list of shape (batch, None, 2) with the index of incoming node i and outgoing node j as (i, j). The first index of the incoming node i is usually expected to be sorted for faster pooling operations, but can also be unsorted (see layer arguments). Furthermore, the graph is directed, so an additional edge (j, i) is required for undirected graphs. A ragged constant can be directly obtained from a list of numpy arrays: tf.ragged.constant(indices, ragged_rank=1, inner_shape=(2,)) which yields shape (batch, None, 2).
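
As a sketch with made-up placeholder values, the ragged node and index inputs can be built from per-graph numpy arrays as described above:

import numpy as np
import tensorflow as tf

# Per-graph numpy arrays with placeholder values: node features and directed edge indices.
node_list = [np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]]),                       # graph 0: 2 nodes, F=3
             np.array([[0.0, 0.0, 1.0], [1.0, 1.0, 0.0], [0.5, 0.5, 0.5]])]      # graph 1: 3 nodes
index_list = [np.array([[0, 1], [1, 0]], dtype="int64"),                         # graph 0: 2 directed edges
              np.array([[0, 1], [1, 0], [1, 2], [2, 1]], dtype="int64")]         # graph 1: 4 directed edges

nodes = tf.ragged.constant(node_list, ragged_rank=1, inner_shape=(3,))           # shape (batch, None, 3)
edge_indices = tf.ragged.constant(index_list, ragged_rank=1, inner_shape=(2,))   # shape (batch, None, 2)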

Model

Models can be set up in a functional way. An example of message passing built from fundamental operations:

import tensorflow.keras as ks
from kgcnn.layers.gather import GatherNodes
from kgcnn.layers.keras import Dense, Concatenate  # ragged support
from kgcnn.layers.pooling import PoolingLocalMessages, PoolingNodes

# Ragged inputs: node features (batch, None, 3) and edge indices (batch, None, 2).
n = ks.layers.Input(shape=(None, 3), name='node_input', dtype="float32", ragged=True)
ei = ks.layers.Input(shape=(None, 2), name='edge_index_input', dtype="int64", ragged=True)

n_in_out = GatherNodes()([n, ei])  # gather node features of both endpoints per edge
node_messages = Dense(10, activation='relu')(n_in_out)  # transform gathered features into edge messages
node_updates = PoolingLocalMessages()([n, node_messages, ei])  # aggregate messages at each receiving node
n_node_updates = Concatenate(axis=-1)([n, node_updates])  # combine original node features with updates
n_embedd = Dense(1)(n_node_updates)  # per-node embedding
g_embedd = PoolingNodes()(n_embedd)  # pool node embeddings to a graph embedding

message_passing = ks.models.Model(inputs=[n, ei], outputs=g_embedd)
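
A hypothetical usage sketch, assuming the ragged nodes and edge_indices tensors built as in the input section above and made-up per-graph regression targets:

import numpy as np

# Made-up targets: one regression value per graph in the batch.
graph_labels = np.array([[0.7], [1.2]])

message_passing.compile(optimizer="adam", loss="mean_squared_error")
message_passing.fit([nodes, edge_indices], graph_labels, epochs=10, batch_size=2)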

Literature

Versions of the following models from the literature are implemented:

Datasets

In data there are simple data handling tools that are used for the examples, including loading datasets.

Examples

A set of example training scripts can be found in example.

Issues

Some known issues to be aware of when using or building new models or layers with kgcnn:

  • RaggedTensor can not yet be used as a keras model output (https://github.com/tensorflow/tensorflow/issues/42320), which means only padded tensors can be used for batched node-embedding tasks (see the sketch below this list).
  • Using RaggedTensors with arbitrary ragged rank outside of kgcnn.layers.keras can cause a significant performance decrease.
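
A minimal sketch of the padding workaround for the first issue, assuming a hypothetical ragged tensor of node embeddings named node_embeddings (plain TensorFlow calls, not kgcnn layers):

import tensorflow as tf

# node_embeddings is assumed to be a ragged tensor of shape (batch, None, F),
# e.g. an intermediate node embedding as in the model example above.
padded = node_embeddings.to_tensor()  # zero-padded dense tensor of shape (batch, N_max, F)
mask = tf.sequence_mask(node_embeddings.row_lengths())  # True for real nodes, False for padding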

Citing

If you want to cite this repo, refer to our preprint:

@article{REISER2021100095,
  title = {Graph neural networks in TensorFlow-Keras with RaggedTensor representation (kgcnn)},
  journal = {Software Impacts},
  pages = {100095},
  year = {2021},
  issn = {2665-9638},
  doi = {10.1016/j.simpa.2021.100095},
  url = {https://www.sciencedirect.com/science/article/pii/S266596382100035X},
  author = {Patrick Reiser and Andre Eberhard and Pascal Friederich}
}

