Semi-Supervised Classification with Graph Convolutional Networks, Thomas N. Kipf and Max Welling, 2017, International Conference on Learning Representations (ICLR), DOI: 10.48550/arXiv.1609.02907 - This foundational paper introduces Graph Convolutional Networks (GCNs), whose layer update combines linear transformations of node features with a normalized summation over neighborhoods before applying a non-linearity. This approach is explicitly mentioned as an alternative in the section and illustrates how non-linearities are integrated into GNN updates.
Inductive Representation Learning on Large Graphs, William L. Hamilton, Rex Ying, and Jure Leskovec, 2017, Advances in Neural Information Processing Systems (NIPS), DOI: 10.48550/arXiv.1706.02216 - This paper introduces GraphSAGE, a prominent GNN architecture whose update mechanism resembles the general concatenation approach described in the section: aggregated neighbor messages are combined with a node's own features and passed through a non-linear transformation.
Graph Representation Learning, William L. Hamilton, 2020, Synthesis Lectures on Artificial Intelligence and Machine Learning, Vol. 14 (Morgan & Claypool Publishers), DOI: 10.2200/S01045ED1V01Y202009AIM046 - This textbook provides a comprehensive theoretical and practical treatment of GNNs, including detailed explanations of the message passing paradigm, aggregation and update functions, and the role of non-linearities in different GNN architectures.
Graph Neural Networks: A Review of Methods and Applications, Jie Zhou, Ganqu Cui, Shengding Hu, Zhengyan Zhang, Cheng Yang, Zhiyuan Liu, Lifeng Wang, Changcheng Li, and Maosong Sun, 2021, AI Open, Vol. 2 (Elsevier), DOI: 10.1016/j.aiopen.2021.01.001 - This survey categorizes and reviews a wide range of GNN models, discussing their message passing mechanisms, including different aggregation and update function designs, and the importance of non-linear transformations in building expressive GNN layers.
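The two update styles contrasted in the entries above can be sketched in a few lines of NumPy. This is a minimal illustration under our own simplifying assumptions (dense adjacency matrix, ReLU non-linearity, hypothetical function names `gcn_update` and `sage_update`), not the papers' reference implementations:

```python
import numpy as np

def relu(x):
    """Element-wise non-linearity applied after each update."""
    return np.maximum(0.0, x)

def gcn_update(H, A, W):
    """GCN-style layer (Kipf & Welling, 2017):
    H' = relu(D^{-1/2} (A + I) D^{-1/2} H W)
    A shared linear transform W, then a symmetrically normalized
    sum over each node's neighborhood (self-loops included)."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^{-1/2}
    return relu(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)

def sage_update(H, A, W_self, W_neigh):
    """GraphSAGE-style layer with mean aggregation (Hamilton et al., 2017):
    mean-aggregate neighbor features, transform them and the node's
    own features separately, concatenate, then apply the non-linearity."""
    deg = np.clip(A.sum(axis=1, keepdims=True), 1.0, None)  # avoid /0
    H_neigh = (A @ H) / deg                                  # mean over neighbors
    return relu(np.concatenate([H @ W_self, H_neigh @ W_neigh], axis=1))
```

Note the structural difference: the GCN update folds self and neighbor information into one normalized sum with a single weight matrix, while the GraphSAGE-style update keeps them as separate transformations whose outputs are concatenated, which is why its output dimension is the sum of the two weight matrices' output widths.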