Usage guide

Calling the model

Building and calling the model is pretty simple:

import tensorflow as tf

from rgat.layers import RGAT

inputs = get_inputs()                                # Dense tensor with shape (?, Features)

support = get_support()                              # Sparse tensor with dense shape (?, ?)
support = tf.sparse_reorder(support)                 # May be necessary, depending on construction

rgat = RGAT(units=FLAGS.units, relations=RELATIONS)  # RELATIONS is an integer indicating the number
                                                     # of relation types in the graph

outputs = rgat(inputs=inputs, support=support)       # Dense tensor with shape (?, FLAGS.units)

We provide implementations of both relational graph attention and relational graph convolution layers. They have a plethora of hyperparameters - check out their respective docstrings for the details.
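The quickest way to see the available hyperparameters from an interactive session is to inspect the docstrings directly (a minimal sketch, assuming the package is installed):

from rgat.layers import RGAT

help(RGAT)                      # Full class docstring, including constructor arguments
print(RGAT.__init__.__doc__)    # Or just the constructor's docstring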

Preparing supports and inputs

Single graph

The simplest case is where we have a single graph to work with, as in the AIFB and MUTAG examples. Here, the features and support tensors represent the features and support structure of that single graph. The features are straightforward:

inputs = get_inputs()                                      # Dense array with shape (Nodes, Features)

The support, on the other hand, should be constructed from an OrderedDict whose keys are the names of the edge types and whose values are the corresponding scipy sparse matrices of dense shape (Nodes, Nodes):

support_dict = get_supports_dict()                         # Ordered dictionary
                                                           # [('rel_1', spmatrix(...)), ...]
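As a concrete sketch (the relation names, edge lists and node count below are made up for illustration), such a dictionary might be assembled with scipy as follows:

from collections import OrderedDict

import numpy as np
import scipy.sparse as sp

NODES = 4                                                  # Hypothetical graph with 4 nodes

# Hypothetical edge lists: one (sources, targets) pair of index lists per relation
edges = {"rel_1": ([0, 1, 2], [1, 2, 3]),
         "rel_2": ([3, 0], [0, 2])}

support_dict = OrderedDict(
    (name, sp.coo_matrix((np.ones(len(rows)), (rows, cols)),
                         shape=(NODES, NODES)))
    for name, (rows, cols) in edges.items())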


The input into the layer is a single features tensor and a single support tensor. To combine the support_dict correctly, we provide the helper function

from rgat.utils import graph_utils as gu

support = gu.relational_supports_to_support(support_dict)  # Sparse tensor of dense shape
                                                           # (Nodes, Relations * Nodes) 

These arrays and sparse tensors can then form the basis for feeding placeholders or constructing TensorFlow datasets. To feed a tf.sparse_placeholder, we also provide the helper function

support_triple = gu.triple_from_coo(support)               # Triple of indices, values and dense shape

which can then be used in

support_ph = tf.sparse_placeholder(...)
feed_dict = {support_ph: support_triple, ...}

Don't forget to tf.sparse_reorder before feeding the support to the layer! For a concrete example see the batching example.
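Putting the pieces together, a minimal end-to-end sketch might look like the following. The names FEATURES, UNITS and node_features are assumed placeholders for your own dimensions and data; support_dict is the ordered dictionary from above.

import tensorflow as tf

from rgat.layers import RGAT
from rgat.utils import graph_utils as gu

support = gu.relational_supports_to_support(support_dict)
support_triple = gu.triple_from_coo(support)

inputs_ph = tf.placeholder(tf.float32, shape=[None, FEATURES])
support_ph = tf.sparse_placeholder(tf.float32)

rgat = RGAT(units=UNITS, relations=RELATIONS)
outputs = rgat(inputs=inputs_ph,
               support=tf.sparse_reorder(support_ph))      # Reorder before feeding to the layer

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    out = sess.run(outputs, feed_dict={inputs_ph: node_features,
                                       support_ph: support_triple})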

Multiple graphs

A more complicated scenario is where we have more than one graph to work with (for example, in a molecular property prediction task). The features are still straightforward:

import numpy as np

inputs = get_inputs()                                      # List of dense arrays, each
                                                           # with shape (?, Features)
inputs = np.concatenate(inputs, axis=0)                    # Dense array with shape
                                                           # (Total nodes, Features)

where Total nodes is the number of nodes across all graphs in the input. The support is generated from a list of OrderedDicts, one for each batch element:

list_of_support_dicts = get_supports()                     # List of ordered dictionaries
                                                           # [[('rel_1', spmatrix(...)), ...], ...]
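Concretely (again with made-up relation names, edges and node counts), each batch element gets its own OrderedDict, built exactly as in the single graph case:

from collections import OrderedDict

import numpy as np
import scipy.sparse as sp

def make_support_dict(edges, nodes):
    # Relation name -> (nodes, nodes) sparse adjacency, as in the single graph case
    return OrderedDict(
        (name, sp.coo_matrix((np.ones(len(rows)), (rows, cols)),
                             shape=(nodes, nodes)))
        for name, (rows, cols) in edges.items())

# Two hypothetical graphs with 3 and 2 nodes respectively
list_of_support_dicts = [
    make_support_dict({"rel_1": ([0, 1], [1, 2])}, nodes=3),
    make_support_dict({"rel_1": ([0], [1])}, nodes=2),
]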

To combine the list_of_support_dicts correctly, we provide the helper function

support = gu.batch_of_relational_supports_to_support(      # Sparse tensor of dense shape
    list_of_support_dicts)                                 # (Total nodes, Relations * Total nodes)

Using these inputs and support, you then proceed as in the single graph case. For a concrete example, see the batching example.
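Since the layer output has shape (Total nodes, units), per-graph node representations can be recovered afterwards by splitting along the node axis. A minimal sketch, assuming graph_sizes holds the number of nodes in each input graph and outputs_value is the evaluated output array:

import numpy as np

graph_sizes = [g.shape[0] for g in get_inputs()]           # Nodes per graph

# Split the concatenated layer output back into one array per graph
outputs_per_graph = np.split(outputs_value,                # outputs_value: the evaluated
                             np.cumsum(graph_sizes)[:-1],  # (Total nodes, units) array
                             axis=0)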