Dependency-Aware Named Entity Recognition with Relative and Global Attentions

09/11/2019
by Gustavo Aguilar, et al.

Named entity recognition is one of the core tasks in NLP. Although many improvements have been made to this task in recent years, state-of-the-art systems do not explicitly account for the recursive nature of language. Instead of treating text as a plain sequence of words, we incorporate a linguistically inspired approach that recognizes entities based on syntax and tree structures. Our model exploits syntactic relationships among words using a Tree-LSTM guided by dependency trees. We then enhance these features with relative and global attention mechanisms: the relative attention detects the most informative words in the sentence with respect to the word being evaluated, while the global attention identifies the most relevant words in the sequence as a whole. Lastly, we linearly project the weighted vectors into the tagging space so that a conditional random field classifier predicts the entity labels. Our findings show that the model attends to words that disclose entity types based on their syntactic roles in a sentence (e.g., verbs such as speak and write are attended when the entity type is PERSON, whereas meet and travel strongly relate to LOCATION). We confirm our findings and establish a new state of the art on two datasets.
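To make the two attention stages concrete, the sketch below shows one plausible reading of them in PyTorch. It is a minimal illustration, not the authors' implementation: the class name, dimensions, and scoring functions are all assumptions, the Tree-LSTM encoder is stood in for by a plain tensor of token features, and the CRF decoder is omitted. Relative attention scores every word against the word being evaluated, global attention scores every word against a single learned sentence-level query, and the weighted vectors are projected into the tagging space.

    # Hypothetical sketch of relative + global attention over token features.
    # Names, dimensions, and scoring functions are illustrative assumptions;
    # the paper's Tree-LSTM encoder and CRF decoder are not shown.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class RelativeGlobalAttention(nn.Module):
        def __init__(self, hidden_dim: int, num_tags: int):
            super().__init__()
            self.rel_query = nn.Linear(hidden_dim, hidden_dim)  # word-vs-word scoring
            self.rel_key = nn.Linear(hidden_dim, hidden_dim)
            self.glob_query = nn.Parameter(torch.randn(hidden_dim))  # sentence-level query
            self.proj = nn.Linear(2 * hidden_dim, num_tags)  # tagging-space projection

        def forward(self, h: torch.Tensor) -> torch.Tensor:
            # h: (seq_len, hidden_dim) token features, e.g. Tree-LSTM outputs.
            # Relative attention: each word attends to all words w.r.t. itself.
            rel_scores = self.rel_query(h) @ self.rel_key(h).T   # (seq, seq)
            rel_ctx = F.softmax(rel_scores, dim=-1) @ h          # (seq, hidden)
            # Global attention: one distribution over the whole sentence.
            glob_scores = h @ self.glob_query                    # (seq,)
            glob_ctx = F.softmax(glob_scores, dim=-1) @ h        # (hidden,)
            glob_ctx = glob_ctx.expand(h.size(0), -1)            # broadcast per word
            # Project the weighted vectors into the tagging space; a CRF
            # layer would consume these emission scores to predict labels.
            return self.proj(torch.cat([rel_ctx, glob_ctx], dim=-1))  # (seq, num_tags)

    # Usage: 6 tokens, 64-dim features, 9 BIO tags.
    emissions = RelativeGlobalAttention(64, 9)(torch.randn(6, 64))
    print(emissions.shape)  # torch.Size([6, 9])

The per-word relative context and the shared global context are concatenated here; other combinations (sums, gates) would fit the abstract's description equally well.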
