CEAR: Cross-Entity Aware Reranker for Knowledge Base Completion

04/18/2021
by   Keshav Kolluru, et al.

Pre-trained language models (LMs) like BERT have been shown to store factual knowledge about the world. This knowledge can be used to augment the information present in Knowledge Bases, which tend to be incomplete. However, prior attempts at using BERT for the task of Knowledge Base Completion (KBC) resulted in performance worse than embedding-based techniques that rely only on the graph structure. In this work we develop a novel model, Cross-Entity Aware Reranker (CEAR), that uses BERT to re-rank the output of existing KBC models with cross-entity attention. Unlike prior work that scores each entity independently, CEAR uses BERT to score the entities together, which is effective for exploiting its factual knowledge. CEAR establishes new state-of-the-art performance with 42.6 HITS@1 on FB15k-237 (a 32.7% improvement) and a 5.3 pt improvement in HITS@1 for Open Link Prediction.
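The key idea, scoring candidate entities jointly rather than independently, can be illustrated with a minimal sketch. The function below is a hypothetical simplification, not the authors' implementation: it replaces BERT with plain dot-product attention over candidate embeddings, letting each candidate's score be informed by the other candidates before being blended with the base KBC model's scores.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_entity_rerank(query_vec, cand_vecs, base_scores):
    """Toy cross-entity reranker (illustrative only).

    query_vec:   (d,)   embedding of the (head, relation) query
    cand_vecs:   (k, d) embeddings of top-k candidates from a base KBC model
    base_scores: (k,)   the base model's scores for those candidates
    """
    d = cand_vecs.shape[1]
    # Candidates attend to one another -- the "cross-entity" step.
    attn = softmax(cand_vecs @ cand_vecs.T / np.sqrt(d), axis=-1)
    context = attn @ cand_vecs        # (k, d) contextualised candidates
    joint = context @ query_vec       # (k,) query-aware joint scores
    return base_scores + joint        # rerank by blending both signals

# Usage with random toy data:
rng = np.random.default_rng(0)
query = rng.normal(size=8)
candidates = rng.normal(size=(5, 8))
scores = rng.normal(size=5)
reranked = cross_entity_rerank(query, candidates, scores)
```

In CEAR itself the joint scoring is done by BERT over the serialized query and candidates, so the model's stored factual knowledge, not just embedding geometry, informs the reranking.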
