Conference Paper Published - IJCNN 2022

A collaborative work with KTH Royal Institute, Sweden, on Graph Convolutional Neural Networks has been accepted and published at the IJCNN 2022 conference.


Abstract:

In this paper, we propose a novel way of representing graphs for processing in Graph Neural Networks. We reduce the dimensionality of the input data by using Random Indexing, a Vector Symbolic Architecture framework; we implement a new trainable neural layer, also inspired by Vector Symbolic Architectures; we leverage the sparseness of the incoming data in a Sparse Neural Network framework. Our experiments on a number of publicly available datasets and standard benchmarks demonstrate that we can reduce the number of parameters by up to two orders of magnitude. We show how this parsimonious approach not only delivers competitive results but even improves performance for node classification and link prediction. We find that this holds in particular for cases where the graph lacks node features.
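
To give a rough sense of how Random Indexing reduces input dimensionality, the sketch below is a minimal, generic NumPy illustration and not the implementation from the paper: the vector dimension, sparsity level, and the `reduce_features` helper are illustrative assumptions, and the paper's VSA-inspired trainable layer and sparse network framework are not shown here.

```python
# Minimal, generic sketch of Random Indexing for dimensionality reduction.
# NOT the paper's pipeline: dimension, sparsity and helper names are
# illustrative assumptions chosen only to show the basic idea.
import numpy as np

def random_index_vectors(num_items, dim=1000, nnz=10, seed=0):
    """Give each item a sparse ternary index vector (+1/-1 at nnz random positions)."""
    rng = np.random.default_rng(seed)
    vectors = np.zeros((num_items, dim))
    for i in range(num_items):
        pos = rng.choice(dim, size=nnz, replace=False)
        vectors[i, pos] = rng.choice([-1.0, 1.0], size=nnz)
    return vectors

def reduce_features(X, dim=1000, nnz=10, seed=0):
    """Project high-dimensional node features X [n_nodes, n_feats] down to dim
    by summing the index vectors of each node's active features."""
    index = random_index_vectors(X.shape[1], dim, nnz, seed)
    return X @ index

# Usage: sparse bag-of-features node attributes with 50k features -> 1000 dims
X = np.random.binomial(1, 0.001, size=(100, 50_000)).astype(float)
Z = reduce_features(X, dim=1000)
print(Z.shape)  # (100, 1000)
```

Because the index vectors are sparse and near-orthogonal in high dimensions, the reduced representation approximately preserves which features a node carries while shrinking the input layer, which is where the large reduction in parameter count comes from.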