Breaking the Limits of Message Passing Graph Neural Networks. Balcilar, M., Héroux, P., Gaüzère, B., Vasseur, P., Adam, S., & Honeine, P. In The 38th International Conference on Machine Learning (ICML), Vienna, Austria, 18-24 July 2021.
Since Message Passing (Graph) Neural Networks (MPNNs) have linear complexity with respect to the number of nodes when applied to sparse graphs, they have been widely implemented and still raise a lot of interest, even though their theoretical expressive power is limited to the first-order Weisfeiler-Lehman test (1-WL). In this paper, we show that if the graph convolution supports are designed in the spectral domain by a non-linear custom function of eigenvalues and masked with an arbitrarily large receptive field, the MPNN is theoretically more powerful than the 1-WL test, experimentally as powerful as an existing 3-WL model, and remains spatially localized. Moreover, by designing custom filter functions, outputs can have various frequency components, which allows the convolution process to learn different relationships between given input graph signals and their associated properties. So far, the best 3-WL-equivalent graph neural networks have a computational complexity of $\mathcal{O}(n^3)$ with memory usage in $\mathcal{O}(n^2)$, rely on a non-local update mechanism, and do not provide a spectrally rich output profile. The proposed method overcomes all of these problems and reaches state-of-the-art results in many downstream tasks.
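The abstract's core idea — convolution supports designed in the spectral domain by a custom function of Laplacian eigenvalues, then masked to a bounded receptive field so they stay spatially localized — can be illustrated with a minimal NumPy sketch. All function names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    D = np.diag(d_inv_sqrt)
    return np.eye(len(A)) - D @ A @ D

def spectral_support(A, filter_fn, k=1):
    """Build one convolution support C = U diag(f(lambda)) U^T in the
    spectral domain, then mask it to a k-hop receptive field.
    `filter_fn` is the custom non-linear function of the eigenvalues."""
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)                 # eigendecomposition of L
    C = U @ np.diag(filter_fn(lam)) @ U.T      # spectral design of the support
    # receptive-field mask: keep (i, j) only if j is within k hops of i
    M = np.linalg.matrix_power(A + np.eye(len(A)), k) > 0
    return C * M                               # spatially localized support

# toy 4-node path graph: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# an illustrative non-linear eigenvalue filter (band-pass-like profile)
C = spectral_support(A, lambda lam: np.exp(-(lam - 1.0) ** 2), k=1)
```

A message-passing layer would then combine several such supports, e.g. `sum_s (C_s @ X) @ W_s`, each `C_s` built from a different filter function so the outputs cover different frequency components.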
@INPROCEEDINGS{21.icml.gnn,
   title={Breaking the Limits of Message Passing Graph Neural Networks},
   author={Muhammet Balcilar and Pierre H{\'e}roux and Benoit Ga{\"u}z{\`e}re and Pascal Vasseur and S{\'e}bastien Adam and Paul Honeine},
   booktitle={The 38th International Conference on Machine Learning (ICML)},
   address = {Vienna, Austria},
   year = {2021},
   month = "18--24~" # jul,
   acronym = {ICML},
   url_paper = {http://honeine.fr/paul/publi/21.icml.gnn.pdf},
   url_link = {https://icml.cc/Conferences/2021},
   abstract = "Since Message Passing (Graph) Neural Networks (MPNNs) have linear complexity with respect to the number of nodes when applied to sparse graphs, they have been widely implemented and still raise a lot of interest, even though their theoretical expressive power is limited to the first-order Weisfeiler-Lehman test (1-WL). In this paper, we show that if the graph convolution supports are designed in the spectral domain by a non-linear custom function of eigenvalues and masked with an arbitrarily large receptive field, the MPNN is theoretically more powerful than the 1-WL test, experimentally as powerful as an existing 3-WL model, and remains spatially localized. Moreover, by designing custom filter functions, outputs can have various frequency components, which allows the convolution process to learn different relationships between given input graph signals and their associated properties.
So far, the best 3-WL-equivalent graph neural networks have a computational complexity of $\mathcal{O}(n^3)$ with memory usage in $\mathcal{O}(n^2)$, rely on a non-local update mechanism, and do not provide a spectrally rich output profile. The proposed method overcomes all of these problems and reaches state-of-the-art results in many downstream tasks.",
}



%   url_code  =  "...",
%   url_presentation  =  "",

%Paper accepted at #ICML2021, "Breaking the Limits of Message Passing Graph Neural Networks", by @balcilar_m, @PierreHeroux, @BGauzere, Pascal Vasseur, @GrandSebAdam and @PaulHoneine, @LitisLab