
# Vanilla MPNN (TF-GNN's classic model flavor)

*New users, please see MtAlbis instead.* It is a more powerful superset of this model.

## Overview

Message Passing Neural Networks (MPNNs) are a general formulation of Graph Neural Networks, originally published for the case of homogeneous graphs by Gilmer et al.: "Neural Message Passing for Quantum Chemistry", 2017.

TF-GNN supports all sorts of Message Passing Neural Networks (MPNNs) on heterogeneous graphs through its tfgnn.keras.layers.GraphUpdate and tfgnn.keras.layers.NodeUpdate classes. For these, users specify a convolution for each edge set (to compute messages on edges and aggregate them at nodes) as well as a next-state layer for each node set. (The TF-GNN Modeling Guide has the details.)

The code here provides a ready-to-use, straightforward realization of MPNNs on heterogeneous graphs in which the messages on each edge set and the next states of each node set are computed by a single-layer neural network on the concatenation of all relevant inputs.
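The per-round computation described above can be sketched in plain NumPy for a toy homogeneous graph. This is an illustration of the scheme (single-layer networks on concatenated inputs, sum-pooling of messages), not the library's implementation; all shapes and the ReLU activation are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy homogeneous graph: 3 nodes, 2 directed edges 0->1 and 1->2.
node_state = rng.normal(size=(3, 4))   # one 4-dim state per node
edge_feat = rng.normal(size=(2, 2))    # one 2-dim feature per edge
src = np.array([0, 1])
tgt = np.array([1, 2])

def single_layer(x, w, b):
    """A single-layer neural network (here with ReLU activation)."""
    return np.maximum(x @ w + b, 0.0)

# Message on each edge: single layer on the concatenation of
# source state, target state, and edge feature.
msg_in = np.concatenate([node_state[src], node_state[tgt], edge_feat], axis=1)
w_msg, b_msg = rng.normal(size=(msg_in.shape[1], 8)), np.zeros(8)
messages = single_layer(msg_in, w_msg, b_msg)   # shape (num_edges, 8)

# Aggregate messages at each receiver node (sum-pooling).
pooled = np.zeros((3, 8))
np.add.at(pooled, tgt, messages)

# Next state per node: single layer on concat(old state, pooled messages).
next_in = np.concatenate([node_state, pooled], axis=1)
w_next, b_next = rng.normal(size=(next_in.shape[1], 4)), np.zeros(4)
next_state = single_layer(next_in, w_next, b_next)  # shape (num_nodes, 4)
```

Note that node 0 receives no edges in this toy graph, so its pooled message vector is all zeros and its next state depends only on its old state.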

Reduced to the homogeneous case, this recovers Interaction Networks, originally published by Battaglia et al.: "Interaction Networks for Learning about Objects, Relations and Physics", 2016.

Gilmer et al. (loc. cit.) discuss these as "pair message" MPNNs when both endpoint states and the edge feature are used for the message. The authors of this code have found them to be quite a powerful baseline.

## Usage

TensorFlow programs can import and use this model as described in its API docs.

## API stability

This model is covered by semantic versioning of TensorFlow GNN's open-source releases: new minor versions do not break existing users.