Bridging the Knowledge Gap: Enhancing Question Answering with World and Domain Knowledge

10/16/2019
by   Travis R. Goodwin, et al.

In this paper we present OSCAR (Ontology-based Semantic Composition Augmented Regularization), a method for injecting task-agnostic knowledge from an ontology or knowledge graph into a neural network during pretraining. We evaluated the impact of including OSCAR when pretraining BERT with Wikipedia articles by measuring performance after fine-tuning on two question answering tasks involving world knowledge and causal reasoning and one requiring domain (healthcare) knowledge. We obtained 33.3% improved accuracy compared to pretraining BERT without OSCAR and achieved new state-of-the-art results on two of the tasks.
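The abstract does not spell out how the knowledge injection works, but the phrase "regularization during pretraining" suggests adding a knowledge-based penalty term to the standard masked-language-model loss. The sketch below is a hypothetical illustration of that general idea, not OSCAR itself: it assumes precomputed knowledge-graph concept embeddings, a token-to-concept alignment, and a cosine-distance penalty, all of which are our own choices rather than details from the paper.

```python
# Hypothetical sketch of ontology-based regularization during pretraining.
# None of these function names or design choices come from the paper; they
# illustrate one plausible way to add a knowledge-graph penalty to an MLM loss.

import torch
import torch.nn.functional as F


def ontology_regularizer(token_embeddings, concept_embeddings, alignment):
    """Penalize distance between contextual token embeddings and the (frozen)
    embeddings of the ontology concepts they are aligned to.

    token_embeddings:   (batch, seq_len, dim) contextual embeddings from BERT
    concept_embeddings: (num_concepts, dim) precomputed knowledge-graph embeddings
    alignment:          (batch, seq_len) concept id per token, -1 if unaligned
    """
    mask = alignment >= 0
    if not mask.any():
        # No token in the batch maps to an ontology concept: zero penalty.
        return token_embeddings.new_zeros(())
    aligned_tokens = token_embeddings[mask]                 # (n, dim)
    aligned_concepts = concept_embeddings[alignment[mask]]  # (n, dim)
    # Use 1 - cosine similarity as the distance to minimize.
    return (1.0 - F.cosine_similarity(aligned_tokens,
                                      aligned_concepts, dim=-1)).mean()


def pretraining_loss(mlm_loss, token_embeddings, concept_embeddings,
                     alignment, reg_weight=0.1):
    """Total loss = standard masked-LM loss + weighted ontology regularizer."""
    return mlm_loss + reg_weight * ontology_regularizer(
        token_embeddings, concept_embeddings, alignment)
```

Because the regularizer only depends on embeddings and an alignment table, it is task-agnostic in the sense the abstract describes: it can be dropped into any pretraining loop without changing the downstream fine-tuning procedure.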
