# TF-GNN Models

## Introduction

This directory contains a collection of GNN models implemented with the TF-GNN library. Some of them offer reusable pieces that can be imported next to the core TF-GNN library, which effectively makes them little libraries of their own.

## Usage

If, for example, the hypothetical FancyNet model offered a graph update layer, its use would look like this:

```python
import tensorflow_gnn as tfgnn
from tensorflow_gnn.models import fancynet

graph = fancynet.FancyGraphUpdate(units=42, fanciness=0.99, ...)(graph)
```

...and require a separate dependency for fancynet in a BUILD file.
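For Bazel users, such a dependency might be declared as in the following sketch. Both the `fancynet` package and the target paths are illustrative placeholders, not real targets in the TF-GNN repository:

```
py_library(
    name = "my_model",
    srcs = ["my_model.py"],
    deps = [
        "//tensorflow_gnn",                   # core TF-GNN library
        "//tensorflow_gnn/models/fancynet",   # hypothetical model package
    ],
)
```

The point is that each model is a separate build target, so depending on the core library alone does not pull in any model code.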

## API stability

Each model comes with a README file that describes its intended level of API stability. Not all models are covered by the semantic versioning of the TF-GNN package.

## List of Models

  * Contrastive Losses: Contrastive losses for self-supervised learning.
  * GATv2: Graph Attention Networks v2 (Brody et al., 2021).
  * GCN: Graph Convolutional Networks (Kipf & Welling, 2016), for homogeneous graphs only.
  * GraphSAGE (Hamilton et al., 2017).
  * MtAlbis: Model Template "Albis" for easy configuration of a few field-tested GNN architectures, generalizing VanillaMPNN.
  * MultiHeadAttention: Transformer-style multi-head attention on graphs (Dwivedi & Bresson, 2021).
  * VanillaMPNN: TF-GNN's classic baseline model, based on (Gilmer et al., 2017).

Unsure? For generic node prediction tasks on relational data, we recommend starting with MtAlbis.
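As background, all of the models above build on the same message-passing pattern: each node aggregates messages from its neighbors and combines them with its own state. The following plain-Python sketch is illustrative only (it is not the TF-GNN API and uses no TF-GNN types); node states are lists of floats and edges are `(source, target)` pairs:

```python
def message_passing_round(node_states, edges):
    """One round of message passing: each node's new state is its old state
    plus the mean of its incoming neighbors' states (a minimal graph update)."""
    incoming = {i: [] for i in range(len(node_states))}
    for src, dst in edges:
        incoming[dst].append(node_states[src])  # message = sender's state

    new_states = []
    for i, state in enumerate(node_states):
        msgs = incoming[i]
        if msgs:
            # Elementwise mean over all incoming messages.
            pooled = [sum(vals) / len(msgs) for vals in zip(*msgs)]
        else:
            pooled = [0.0] * len(state)  # no neighbors: state unchanged below
        new_states.append([s + p for s, p in zip(state, pooled)])
    return new_states

# Nodes 0 and 1 send to node 2; node 2 pools mean(1.0, 2.0) = 1.5.
states = [[1.0], [2.0], [4.0]]
edges = [(0, 2), (1, 2)]
print(message_passing_round(states, edges))  # → [[1.0], [2.0], [5.5]]
```

The real models differ in how messages are computed (e.g. attention weights in GATv2 and MultiHeadAttention, learned transformations in VanillaMPNN), but they share this aggregate-and-update structure.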