Mincut pooling in Graph Neural Networks

06/30/2019
by Filippo Maria Bianchi, et al.

The advance of node pooling operations in a Graph Neural Network (GNN) has lagged behind the feverish design of new graph convolution techniques, and pooling remains an important and challenging endeavor for the design of deep architectures. In this paper, we propose a pooling operation for GNNs that implements a differentiable unsupervised loss based on the mincut optimization objective. First, we validate the effectiveness of the proposed loss function by clustering nodes in citation networks and through visualization examples, such as image segmentation. Then, we show how the proposed pooling layer can be used to build a deep GNN architecture for graph classification.
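To make the idea more concrete, here is a minimal PyTorch sketch of how a mincut-based pooling layer might be structured: soft cluster assignments coarsen the node features and adjacency, and an unsupervised mincut-plus-orthogonality term is returned as an auxiliary loss. The class and parameter names (MinCutPool, n_clusters, the linear assignment layer) and the dense-adjacency formulation are assumptions for illustration, not the authors' implementation.

```python
# A rough sketch (not the paper's code) of a mincut-style pooling layer
# operating on a dense adjacency matrix A and node features X.
import torch
import torch.nn as nn


class MinCutPool(nn.Module):
    """Soft-cluster nodes and coarsen the graph with a mincut-style loss."""

    def __init__(self, in_dim, n_clusters):
        super().__init__()
        # Maps node features to soft cluster assignments (illustrative choice).
        self.assign = nn.Linear(in_dim, n_clusters)

    def forward(self, X, A):
        # X: (N, F) node features, A: (N, N) symmetric adjacency.
        S = torch.softmax(self.assign(X), dim=-1)   # (N, K) soft assignments

        # Coarsened features and adjacency.
        X_pool = S.t() @ X                          # (K, F)
        A_pool = S.t() @ A @ S                      # (K, K)

        # Mincut term: reward intra-cluster edges relative to cluster degree.
        D = torch.diag(A.sum(dim=-1))
        cut_loss = -torch.trace(S.t() @ A @ S) / (torch.trace(S.t() @ D @ S) + 1e-9)

        # Orthogonality term: push assignments toward balanced, non-degenerate clusters.
        SS = S.t() @ S
        K = SS.size(0)
        I_K = torch.eye(K, device=S.device)
        ortho_loss = torch.norm(SS / torch.norm(SS) - I_K / K ** 0.5)

        return X_pool, A_pool, cut_loss + ortho_loss


# Usage: interleave with graph convolutions and add the auxiliary loss
# to the task loss (e.g., cross-entropy for graph classification).
if __name__ == "__main__":
    X = torch.randn(50, 16)                 # 50 nodes, 16 features
    A = (torch.rand(50, 50) > 0.9).float()
    A = ((A + A.t()) > 0).float()           # symmetrize
    pool = MinCutPool(in_dim=16, n_clusters=5)
    X_p, A_p, aux_loss = pool(X, A)
    print(X_p.shape, A_p.shape, aux_loss.item())
```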
