Improving Expressivity of Graph Neural Networks

Stanisław Purgał

International Joint Conference on Neural Networks (IJCNN), pp. 1 – 7, 2020.


We propose a Graph Neural Network with greater expressive power than commonly used GNNs, one not constrained to differentiating only between graphs that the Weisfeiler-Lehman test recognizes as non-isomorphic. We use a graph attention network with an expanding attention window that aggregates information from nodes exponentially far away. We also use partially random initial embeddings, allowing differentiation between nodes that would otherwise look identical. This could cause problems with the traditional dropout mechanism, so we instead use a “head dropout”, randomly ignoring whole attention heads rather than individual dimensions of the embedding.
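Two of the mechanisms described above can be sketched in a few lines. The function names, tensor shapes, and the inverted-dropout rescaling are illustrative assumptions, not the paper's exact formulation: the first appends random dimensions to each node's input features; the second drops whole attention heads at once instead of individual embedding dimensions.

```python
import numpy as np


def partially_random_embedding(features, rand_dim, rng):
    """Append rand_dim random dimensions to each node's input features.

    The random part lets the network tell apart nodes that are otherwise
    indistinguishable, e.g. nodes the Weisfeiler-Lehman test cannot separate.
    (Hypothetical helper; shapes and naming are assumptions.)
    """
    noise = rng.random((features.shape[0], rand_dim))
    return np.concatenate([features, noise], axis=1)


def head_dropout(head_outputs, p, rng, training=True):
    """Drop entire attention heads with probability p.

    head_outputs: array of shape (num_nodes, num_heads, head_dim).
    Unlike standard dropout, which zeroes individual embedding dimensions,
    a whole head is either kept intact or removed entirely, so the random
    part of an embedding is never partially destroyed. Surviving heads are
    rescaled by 1/(1-p), as in inverted dropout (an assumption here).
    """
    if not training or p == 0.0:
        return head_outputs
    num_nodes, num_heads, _ = head_outputs.shape
    # One keep/drop decision per (node, head) pair, broadcast over head_dim.
    mask = rng.random((num_nodes, num_heads, 1)) >= p
    return head_outputs * mask / (1.0 - p)
```

At evaluation time `head_dropout` is the identity, mirroring ordinary dropout; during training every value in a dropped head is zeroed together.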


doi:10.1109/IJCNN48605.2020.9206591 | © Copyright IEEE


@inproceedings{Purgal2020,
  author    = {Stanislaw J. Purgal},
  title     = {Improving Expressivity of Graph Neural Networks},
  booktitle = {2020 International Joint Conference on Neural Networks, {IJCNN} 2020},
  pages     = {1--7},
  publisher = {IEEE},
  year      = {2020},
  url       = {},
  doi       = {10.1109/IJCNN48605.2020.9206591},
}