Invariant Layers for Graphs with Nodes of Different Types

02/27/2023
by Dmitry Rybin, et al.

Neural networks that satisfy invariance with respect to input permutations have been widely studied in the machine learning literature. However, in many applications, only a subset of all input permutations is of interest. For heterogeneous graph data, one can focus on permutations that preserve node types. We fully characterize linear layers invariant to such permutations. We verify experimentally that implementing these layers in graph neural network architectures allows learning important node interactions more effectively than existing techniques. We show that the dimension of the space of these layers is given by a generalization of Bell numbers, extending the work of Maron et al. (2019). We further narrow the invariant network design space by addressing a question about the sizes of tensor layers necessary for function approximation on graph data. Our findings suggest that function approximation on a graph with n nodes can be done with tensors of sizes ≤ n, which is tighter than the best-known bound ≤ n(n-1)/2. For d × d image data with translation symmetry, our methods give a tight upper bound 2d - 1 (instead of d^4) on the sizes of invariant tensor generators, via a surprising connection to Davenport constants.
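To make the group in question concrete: the permutations that preserve node types form a product of symmetric groups, one factor per type. The Python sketch below (illustrative only, not code from the paper) enumerates this group for a small hypothetical type assignment and computes classical Bell numbers B(k), which in the single-type setting of Maron et al. (2019) count the linearly independent invariant linear layers on order-k tensors; the paper's generalized count for multiple types is not reproduced here.

```python
import itertools
import math
from collections import defaultdict

def type_preserving_perms(types):
    """Yield every permutation of range(len(types)) that maps each node
    to a node of the same type, i.e. the group S_{n_1} x ... x S_{n_t},
    where n_t is the number of nodes of type t."""
    blocks = defaultdict(list)
    for node, t in enumerate(types):
        blocks[t].append(node)
    block_lists = list(blocks.values())
    # A type-preserving permutation independently permutes each type block.
    for images in itertools.product(
        *(itertools.permutations(b) for b in block_lists)
    ):
        perm = [0] * len(types)
        for block, image in zip(block_lists, images):
            for src, dst in zip(block, image):
                perm[src] = dst
        yield tuple(perm)

def bell(k):
    """k-th Bell number via the Bell triangle. In the single-type case,
    Maron et al. (2019) show B(k) is the dimension of the space of
    permutation-invariant linear layers on order-k tensors."""
    row = [1]                  # row 0 of the triangle; B(0) = 1
    for _ in range(k):
        new = [row[-1]]        # each row starts with the previous row's last entry
        for x in row:
            new.append(new[-1] + x)
        row = new
    return row[0]              # first entry of row k is B(k)

if __name__ == "__main__":
    types = ["a", "a", "a", "b", "b"]   # hypothetical graph: 3 type-a nodes, 2 type-b
    perms = list(type_preserving_perms(types))
    counts = defaultdict(int)
    for t in types:
        counts[t] += 1
    # The group has size 3! * 2! = 12.
    assert len(perms) == math.prod(math.factorial(c) for c in counts.values())
    print(len(perms), "type-preserving permutations")
    print([bell(k) for k in range(6)])  # [1, 1, 2, 5, 15, 52]
```

Since this group is finite, averaging any linear map over it projects the map onto the invariant subspace, which is one standard way to realize invariant layers in practice.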
