Neural Message Passing for Quantum Chemistry, Justin Gilmer, Samuel S. Schoenholz, Patrick F. Riley, Oriol Vinyals, George E. Dahl, 2017. Proceedings of the 34th International Conference on Machine Learning (ICML), PMLR Vol. 70. DOI: 10.5555/3305890.3306020 - Introduces the general "Message Passing Neural Network" (MPNN) framework, which formalizes the aggregate-update mechanism underlying many graph neural network variants.
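  In the paper's notation, the message passing phase runs for T steps using a message function M_t and an update function U_t, followed by a readout R over the final node states:

    m_v^{t+1} = \sum_{w \in N(v)} M_t\big(h_v^t, h_w^t, e_{vw}\big)
    h_v^{t+1} = U_t\big(h_v^t, m_v^{t+1}\big)
    \hat{y}   = R\big(\{\, h_v^{T} \mid v \in G \,\}\big)

  Here h_v^t is the hidden state of node v at step t, e_{vw} the edge feature between v and w, and N(v) the neighbors of v.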
Inductive Representation Learning on Large Graphs, William L. Hamilton, Rex Ying, Jure Leskovec, 2017. Advances in Neural Information Processing Systems (NeurIPS), Vol. 30. DOI: 10.5555/3295222.3295267 - Introduces GraphSAGE, explicitly exploring different neighborhood aggregation functions (mean, LSTM, pooling) and demonstrating the inductive capabilities of the message passing framework.
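  To make the aggregation step concrete, here is a minimal NumPy sketch of a single mean-aggregator layer in the spirit of GraphSAGE. It is illustrative only, not the authors' reference implementation; the function and parameter names (sage_mean_layer, W_self, W_neigh) are assumptions made for this sketch.

  ```python
  import numpy as np

  def sage_mean_layer(h, neighbors, W_self, W_neigh):
      """One GraphSAGE-style layer with mean aggregation (illustrative sketch).

      h         : (num_nodes, d_in) current node representations
      neighbors : dict mapping node id -> list of neighbor node ids
      W_self    : (d_in, d_out) weights applied to the node's own features
      W_neigh   : (d_in, d_out) weights applied to the aggregated neighborhood
      """
      out = np.empty((h.shape[0], W_self.shape[1]))
      for v in range(h.shape[0]):
          nbrs = neighbors.get(v, [])
          # Mean-aggregate the neighbors' representations (zero vector if isolated).
          agg = h[nbrs].mean(axis=0) if nbrs else np.zeros(h.shape[1])
          # Combine self and neighborhood information, then apply a ReLU nonlinearity.
          out[v] = np.maximum(h[v] @ W_self + agg @ W_neigh, 0.0)
      # The paper additionally L2-normalizes the output representations.
      norms = np.linalg.norm(out, axis=1, keepdims=True)
      return out / np.maximum(norms, 1e-12)

  # Example usage on a toy 4-node graph:
  # h = np.random.rand(4, 8)
  # neighbors = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
  # z = sage_mean_layer(h, neighbors, np.random.rand(8, 16), np.random.rand(8, 16))
  ```

  Swapping the mean for an LSTM or max-pooling aggregator, as the paper explores, changes only the line that computes agg.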
Relational inductive biases, deep learning, and graph networks, Peter W. Battaglia, Jessica B. Hamrick, Victor Bapst, Alvaro Sanchez-Gonzalez, Vinicius Zambaldi, Mateusz Malinowski, Andrea Tacchetti, David Raposo, Adam Santoro, Ryan Faulkner, Caglar Gulcehre, Francis Song, Andrew Ballard, Justin Gilmer, George Dahl, Ashish Vaswani, Kelsey Allen, Charles Nash, Victoria Langston, Chris Dyer, Nicolas Heess, Daan Wierstra, Pushmeet Kohli, Matt Botvinick, Oriol Vinyals, Yujia Li, Razvan Pascanu, 2018. arXiv preprint arXiv:1806.01261. DOI: 10.48550/arXiv.1806.01261 - Presents a unified conceptual framework for "Graph Networks" that generalizes many graph neural network approaches, including the message passing mechanism, by focusing on object, relation, and global attributes.
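  In the paper's notation, a full GN block updates edge (relation), node (object), and global attributes in turn, with φ denoting update functions and ρ denoting permutation-invariant aggregations:

    e'_k = \phi^{e}\big(e_k, v_{r_k}, v_{s_k}, u\big)
    v'_i = \phi^{v}\big(\bar{e}'_i, v_i, u\big)
    u'   = \phi^{u}\big(\bar{e}', \bar{v}', u\big)

  where \bar{e}'_i aggregates the updated edges incident to node i, and \bar{e}', \bar{v}' aggregate all updated edges and nodes for the global update. The MPNN message/update/readout scheme above is recovered as a special case of this block.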