Improving Distant Supervision with Maxpooled Attention and Sentence-Level Supervision

10/30/2018
by Iz Beltagy, et al.

We propose an effective multitask learning setup for reducing distant supervision noise by leveraging sentence-level supervision. We show how sentence-level supervision can be used to improve the encoding of individual sentences and to learn which input sentences are more likely to express the relationship between a pair of entities. We also introduce a novel neural architecture for collecting signals from multiple input sentences, which combines the benefits of attention and maxpooling. The proposed method increases AUC by 10% on the FB-NYT dataset.
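As a rough illustration of how attention and maxpooling might be combined when aggregating a bag of sentences, here is a minimal sketch. This is a hypothetical reading, not the authors' exact formulation: each sentence encoding is scaled by its attention weight, and an elementwise max is then taken across sentences, so every output dimension is driven by the (weighted) sentence that expresses it most strongly. The function names and scoring inputs are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def maxpooled_attention(sent_encodings, scores):
    """Aggregate a bag of sentence encodings into one relation vector.

    sent_encodings: (n_sentences, dim) array of sentence vectors.
    scores: (n_sentences,) relevance scores, e.g. from a learned scorer.

    Each sentence vector is scaled by its softmax attention weight,
    then an elementwise max is taken across sentences, combining the
    soft weighting of attention with the selectivity of maxpooling.
    """
    weights = softmax(scores)                     # (n,)
    weighted = sent_encodings * weights[:, None]  # (n, dim)
    return weighted.max(axis=0)                   # (dim,)

# Tiny usage example with two 2-D sentence encodings and equal scores.
enc = np.array([[1.0, 0.0],
                [0.0, 1.0]])
bag_vector = maxpooled_attention(enc, np.array([0.0, 0.0]))
```

With equal scores, both sentences receive weight 0.5, and the maxpool picks the stronger weighted value per dimension.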
