Preprint Article, Version 1 (not peer-reviewed)

Universal Local Attractors on Graphs

Version 1: Received: 18 April 2024 / Approved: 19 April 2024 / Online: 19 April 2024 (11:43:13 CEST)

How to cite: Krasanakis, E.; Papadopoulos, S.; Kompatsiaris, I. Universal Local Attractors on Graphs. Preprints 2024, 2024041340. https://doi.org/10.20944/preprints202404.1340.v1

Abstract

Being able to express broad families of equivariant or invariant attributed graph functions is a popular measuring stick of whether graph neural networks should be employed in practical applications. However, it is equally important to learn deep local minima of losses (i.e., minima with much smaller loss values than other minima), even when architectures cannot express global minima. In this work we introduce the architectural property of attracting GNN optimization trajectories to local minima as a means of achieving smaller losses. We take first steps toward satisfying this property for losses defined over attributed, undirected, unweighted graphs with a novel architecture, called Universal Local Attractor (ULA). The latter refines each dimension of end-to-end trained node feature embeddings based on graph structure to track the optimization trajectories of losses satisfying some mild conditions. The refined dimensions are then linearly pooled to create predictions. We experiment on 10 tasks, from node classification to clique detection, on which ULA matches or significantly outperforms popular alternatives of similar or greater theoretical expressive power.
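The abstract describes a pipeline of end-to-end trained node embeddings, per-dimension structural refinement, and linear pooling. Below is a minimal, hypothetical PyTorch sketch of that pattern, not the authors' implementation: it assumes the per-dimension refinement is a personalized-PageRank-style diffusion (one choice consistent with the paper's "Diffusion" keyword), and the class name ULASketch, the hyperparameters alpha and steps, and all layer choices are illustrative assumptions.

```python
# A minimal sketch of the embed -> refine -> linearly pool pattern described
# in the abstract. The diffusion-based refinement is an assumption; the paper
# defines its own refinement scheme with formal guarantees.
import torch
import torch.nn as nn


class ULASketch(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int,
                 alpha: float = 0.9, steps: int = 10):
        super().__init__()
        self.embed = nn.Linear(in_dim, hidden_dim)  # end-to-end trained embeddings
        self.pool = nn.Linear(hidden_dim, out_dim)  # linear pooling of refined dims
        self.alpha, self.steps = alpha, steps

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Symmetrically normalize the (undirected, unweighted) adjacency matrix.
        deg = adj.sum(dim=1).clamp(min=1.0)
        d_inv_sqrt = deg.pow(-0.5)
        norm_adj = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

        h0 = self.embed(x)
        h = h0
        # Refine every embedding dimension independently with the same
        # personalized-PageRank-style diffusion over the graph structure.
        for _ in range(self.steps):
            h = self.alpha * norm_adj @ h + (1.0 - self.alpha) * h0
        return self.pool(h)


if __name__ == "__main__":
    # Toy 4-node path graph with random 8-dimensional node features.
    adj = torch.tensor([[0, 1, 0, 0],
                        [1, 0, 1, 0],
                        [0, 1, 0, 1],
                        [0, 0, 1, 0]], dtype=torch.float32)
    x = torch.randn(4, 8)
    model = ULASketch(in_dim=8, hidden_dim=16, out_dim=3)
    print(model(x, adj).shape)  # torch.Size([4, 3])
```

Because each hidden dimension is diffused independently and only mixed by the final linear layer, this sketch preserves the abstract's separation between structural refinement and prediction.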

Keywords

Graph Neural Networks; Universal Approximation; Local Attractors; Diffusion; Attributed Graphs

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning

