Adapting BERT for Word Sense Disambiguation with Gloss Selection Objective and Example Sentences

09/24/2020
by Boon Peng Yap, et al.

Domain adaptation or transfer learning using pre-trained language models such as BERT has proven effective for many natural language processing tasks. In this work, we propose to formulate word sense disambiguation (WSD) as a relevance ranking task, and fine-tune BERT on a sequence-pair ranking task to select the most probable sense definition given a context sentence and a list of candidate sense definitions. We also introduce a data augmentation technique for WSD using existing example sentences from WordNet. Using the proposed training objective and data augmentation technique, our models achieve state-of-the-art results on the English all-words benchmark datasets.
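The gloss selection objective can be pictured as a cross-encoder over context-gloss pairs: each candidate definition is concatenated with the context sentence, scored by BERT, and the highest-scoring gloss is taken as the predicted sense. The sketch below is an assumption-laden illustration, not the authors' released code: it uses an off-the-shelf `bert-base-uncased` checkpoint with an untrained single-logit scoring head, whereas the paper fine-tunes BERT with a ranking objective over the candidates. The helper names `disambiguate` and `augmented_pairs` are hypothetical.

```python
# Minimal sketch of gloss selection as sequence-pair ranking (assumptions
# noted above). Requires: pip install torch transformers nltk, and a one-time
# nltk.download("wordnet").
import torch
from nltk.corpus import wordnet as wn
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# num_labels=1 gives a single relevance logit per (context, gloss) pair;
# this head is randomly initialized and would be learned during fine-tuning.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1
)
model.eval()

def disambiguate(context: str, lemma: str) -> str:
    """Return the WordNet gloss whose pairing with `context` scores highest."""
    glosses = [s.definition() for s in wn.synsets(lemma)]
    # Encode all (context, gloss) pairs as one batch of sequence pairs.
    inputs = tokenizer(
        [context] * len(glosses), glosses,
        padding=True, truncation=True, return_tensors="pt",
    )
    with torch.no_grad():
        scores = model(**inputs).logits.squeeze(-1)  # one score per candidate
    return glosses[scores.argmax().item()]

def augmented_pairs(lemma: str):
    """Extra positive training pairs in the spirit of the paper's data
    augmentation: each WordNet example sentence is treated as a context
    sentence whose correct gloss is its own synset's definition."""
    for synset in wn.synsets(lemma):
        for example in synset.examples():
            yield example, synset.definition()

print(disambiguate("He sat on the bank of the river.", "bank"))
```

In an actual training setup, the relevance head would be fine-tuned with the ranking loss over each context's candidate glosses, with `augmented_pairs` supplying the additional WordNet-derived examples the abstract describes.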
