*New users, please see MtAlbis instead. It is a more powerful superset of this model.*
Message Passing Neural Networks (MPNNs) are a general formulation of Graph Neural Networks, originally published for the case of homogeneous graphs by
- J. Gilmer, S.S. Schoenholz, P.F. Riley, O. Vinyals, G.E. Dahl: "Neural Message Passing for Quantum Chemistry", ICML 2017.
TF-GNN supports all sorts of Message Passing Neural Networks (MPNNs) on heterogeneous graphs through its `tfgnn.keras.layers.GraphUpdate` and `tfgnn.keras.layers.NodeSetUpdate` classes. For those, users need to specify a convolution for each edge set (to compute messages on edges and their aggregation to nodes) as well as a next-state layer for each node set. (The TF-GNN Modeling Guide has the details.)
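For illustration, a single round of updates could be written out by hand roughly like this (a minimal sketch, assuming a graph with an edge set "writes" pointing into a node set "paper"; the set names and layer sizes are made up for the example):

```python
import tensorflow as tf
import tensorflow_gnn as tfgnn

# One round of message passing into the "paper" node set (illustrative only).
graph_update = tfgnn.keras.layers.GraphUpdate(node_sets={
    "paper": tfgnn.keras.layers.NodeSetUpdate(
        # One convolution per edge set: a Dense layer computes a message on
        # each edge, and "sum" pooling aggregates messages at receiver nodes.
        {"writes": tfgnn.keras.layers.SimpleConv(
            message_fn=tf.keras.layers.Dense(32, "relu"),
            reduce_type="sum",
            receiver_tag=tfgnn.TARGET)},
        # One next-state layer per node set: computes the new node state from
        # the concatenation of the old state and the pooled messages.
        tfgnn.keras.layers.NextStateFromConcat(
            tf.keras.layers.Dense(32, "relu")))})
```

Calling `graph_update` on a scalar `tfgnn.GraphTensor` returns a graph with updated hidden states on the "paper" node set.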
The code in this directory provides a ready-to-use, straightforward realization of MPNNs on heterogeneous graphs in which the messages on an edge set and the next states of a node set are each computed from a single-layer neural network on the concatenation of all relevant inputs.
Reduced to the homogeneous case, this recovers Interaction Networks, originally published by
- P. Battaglia, R. Pascanu, M. Lai, D. Jimenez Rezende, K. Kavukcuoglu: "Interaction Networks for Learning about Objects, Relations and Physics", NIPS 2016.
Gilmer et al. (cited above) discuss them as "pair message" MPNNs when both endpoint states and the edge feature are used to compute the message. The authors of this code have found them to be quite a powerful baseline.
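In formulas, one round of updates looks roughly as follows (a sketch with sum pooling and a single edge set; biases are omitted, and the activation, pooling, and choice of message inputs are configurable). For an edge $e = (u, v)$ with sender state $h_u$, receiver state $h_v$, and edge feature $x_e$:

$$
m_e = \mathrm{ReLU}\big(W_{\text{msg}}\,[\,h_u \,\|\, h_v \,\|\, x_e\,]\big),
\qquad
h_v' = \mathrm{ReLU}\big(W_{\text{next}}\,[\,h_v \,\|\, \textstyle\sum_{e \,\text{into}\, v} m_e\,]\big),
$$

where $\|$ denotes concatenation. With multiple edge sets, each node set concatenates the pooled messages from all of its incoming edge sets before applying $W_{\text{next}}$.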
TensorFlow programs can import and use this model as described in its API docs.
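A minimal sketch of such a program (hyperparameter values are made up, and `graph_spec` is an assumed input spec; see the API docs for the full list of arguments):

```python
import tensorflow as tf
import tensorflow_gnn as tfgnn
from tensorflow_gnn.models import vanilla_mpnn

# One round of message passing on every node set of the input graph.
graph_update = vanilla_mpnn.VanillaMPNNGraphUpdate(
    units=64,        # Size of each node's next state.
    message_dim=64,  # Size of the message computed on each edge.
    receiver_tag=tfgnn.TARGET,  # Messages flow from source to target nodes.
)

# Applied like any Keras layer to a scalar GraphTensor; `graph_spec` is an
# assumed tfgnn.GraphTensorSpec describing the model's input.
inputs = tf.keras.layers.Input(type_spec=graph_spec)
outputs = graph_update(inputs)
model = tf.keras.Model(inputs, outputs)
```

Stacking several such layers performs multiple rounds of message passing.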
This model is covered by semantic versioning of TensorFlow GNN's open-source releases: new minor versions do not break existing users.