HyLa: Hyperbolic Laplacian Features For Graph Learning
Due to its geometric properties, hyperbolic space can support high-fidelity embeddings of tree- and graph-structured data. For graph learning, points in hyperbolic space have been used successfully as signals in deep neural networks: e.g. hyperbolic graph convolutional networks (GCNs) can outperform vanilla GCNs. However, existing hyperbolic networks are computationally expensive and can be numerically unstable, and these shortcomings prevent them from scaling to large graphs. In this paper, we propose HyLa, a completely different approach to using hyperbolic space in graph learning: HyLa maps once from a learned hyperbolic-space embedding to Euclidean space via the eigenfunctions of the Laplacian operator in hyperbolic space. Our method is inspired by the random Fourier feature methodology, which uses the eigenfunctions of the Laplacian in Euclidean space. We evaluate HyLa on downstream tasks including node classification and text classification, where HyLa shows significant improvements over hyperbolic GCNs and other baselines.
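To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of a HyLa-style feature map, assuming the Poincaré-ball model: eigenfunctions of the hyperbolic Laplacian can be built from the Busemann function toward random ideal boundary points, analogously to how random Fourier features use Euclidean Laplacian eigenfunctions (plane waves). The function names, the eigenvalue sampling scale `lam_scale`, and the exact sign convention of the Busemann function are illustrative assumptions.

```python
import numpy as np

def busemann(x, w):
    # Busemann-type function on the Poincare ball toward an ideal point w
    # (|w| = 1); sign convention is one common choice, assumed here:
    # B_w(x) = log( (1 - |x|^2) / |w - x|^2 )
    return np.log((1.0 - np.sum(x ** 2, axis=-1))
                  / np.sum((w - x) ** 2, axis=-1))

def hyla_features(X, n_features=64, lam_scale=1.0, seed=0):
    """Map points X (shape (n, d)) in the Poincare ball to
    Euclidean features of shape (n, n_features)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # random ideal boundary points (unit vectors), eigenvalue
    # parameters lam, and random phases theta
    W = rng.normal(size=(n_features, d))
    W /= np.linalg.norm(W, axis=1, keepdims=True)
    lam = rng.normal(scale=lam_scale, size=n_features)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    # Busemann values for every (point, boundary-point) pair: (n, n_features)
    B = np.stack([busemann(X, w) for w in W], axis=1)
    # eigenfunction-inspired real feature:
    # e^{((d-1)/2) B} * cos(lam * B + theta), averaged RFF-style
    return np.exp(0.5 * (d - 1) * B) * np.cos(lam * B + theta) / np.sqrt(n_features)
```

The key contrast with a hyperbolic GCN is visible here: the hyperbolic geometry is used only once, in this fixed feature map, after which any standard Euclidean model (e.g. a linear layer or vanilla GCN) can consume the features.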