Exponential Separations in Symmetric Neural Networks

06/02/2022
by Aaron Zweig, et al.

In this work we demonstrate a novel separation between symmetric neural network architectures. Specifically, we consider the Relational Network architecture (Santoro et al., 2017) as a natural generalization of the DeepSets architecture (Zaheer et al., 2017), and study the representational gap between them. Restricting to analytic activation functions, we construct a symmetric function acting on sets of size N with elements in dimension D that can be efficiently approximated by the former architecture, but provably requires width exponential in N and D for the latter.
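To make the contrast concrete, the two architectures can be sketched as follows. DeepSets pools a learned per-element embedding, while a Relational Network pools a learned embedding of all element pairs. The functions `phi`, `psi`, and `rho` below are hypothetical placeholders for the learned networks; this is a minimal sketch of the functional forms, not the paper's implementation.

```python
import numpy as np

def deepsets(X, phi, rho):
    # DeepSets: f(X) = rho( sum_i phi(x_i) )
    # X has shape (N, D); phi is applied elementwise per row.
    return rho(np.sum(phi(X), axis=0))

def relational_network(X, psi, rho):
    # Relational Network: f(X) = rho( sum_{i,j} psi(x_i, x_j) )
    # Pools over all ordered pairs of elements, so the pooled
    # features can depend on element interactions directly.
    n = X.shape[0]
    pair_feats = [psi(X[i], X[j]) for i in range(n) for j in range(n)]
    return rho(np.sum(pair_feats, axis=0))

# Toy stand-ins for the learned maps (assumptions for illustration):
phi = lambda x: x ** 2
psi = lambda a, b: a * b
rho = lambda z: float(np.sum(z))

X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
# Both outputs are invariant to permuting the rows of X.
assert np.isclose(deepsets(X, phi, rho), deepsets(X[::-1], phi, rho))
assert np.isclose(relational_network(X, psi, rho),
                  relational_network(X[::-1], psi, rho))
```

Both forms are permutation-invariant by construction; the separation in the paper concerns how wide the pooled feature dimension must be for DeepSets to match functions the pairwise form represents efficiently.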
