Span-based Joint Entity and Relation Extraction with Transformer Pre-training

09/17/2019
by Markus Eberts, et al.

We introduce SpERT, an attention model for span-based joint entity and relation extraction. Our approach employs the pre-trained Transformer network BERT as its core. We use BERT embeddings as shared inputs for a lightweight reasoning layer, which features entity recognition and filtering, as well as relation classification with a localized, marker-free context representation. The model is trained on strong within-sentence negative samples, which are efficiently extracted in a single BERT pass. These aspects facilitate a search over all spans in the sentence. In ablation studies, we demonstrate the benefits of pre-training, strong negative sampling and localized context. Our model outperforms prior work by up to 5% F1 on joint entity and relation extraction.
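To make the span-based design concrete, here is a minimal sketch (not the authors' released code) of the entity-classification part: a single BERT pass produces token embeddings shared by all candidate spans, and each span is scored from its max-pooled token embeddings, a learned span-width embedding, and the [CLS] context, in line with the paper's description. The class name `SpanClassifier`, the hidden sizes, the maximum span width, and the example spans are illustrative assumptions.

```python
# Minimal SpERT-style span classifier sketch (illustrative, not the
# reference implementation). Requires: torch, transformers.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class SpanClassifier(nn.Module):
    """Scores every candidate span using one shared BERT pass."""

    def __init__(self, num_entity_types: int, max_span_width: int = 10):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-cased")
        hidden = self.bert.config.hidden_size
        # Learned embedding of the span's width in tokens.
        self.width_embed = nn.Embedding(max_span_width + 1, 25)
        # Entity classifier over [max-pooled span; width; CLS context].
        self.entity_clf = nn.Linear(2 * hidden + 25, num_entity_types)

    def forward(self, input_ids, attention_mask, spans):
        # One BERT pass; its embeddings are shared by all span candidates,
        # which is what makes an exhaustive span search affordable.
        h = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state
        cls = h[:, 0]  # sentence-level context from the [CLS] token
        logits = []
        for start, end in spans:  # (start, end) are inclusive token indices
            span_repr = h[:, start:end + 1].max(dim=1).values  # max-pool
            width = self.width_embed(
                torch.tensor([end - start + 1], device=h.device)
            ).expand(h.size(0), -1)
            logits.append(self.entity_clf(
                torch.cat([span_repr, width, cls], dim=-1)))
        return torch.stack(logits, dim=1)  # (batch, num_spans, num_types)

# Usage: enumerate candidate spans up to the maximum width and score them.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
enc = tokenizer("SpERT extracts entities and relations from text.",
                return_tensors="pt")
model = SpanClassifier(num_entity_types=5)
spans = [(1, 1), (1, 3), (4, 6)]  # illustrative candidate spans
scores = model(enc["input_ids"], enc["attention_mask"], spans)
print(scores.shape)  # torch.Size([1, 3, 5])
```

In the full model, spans classified as entities would then be paired and passed to a relation classifier that additionally max-pools the tokens between the two spans as a localized, marker-free context representation; negative spans and span pairs drawn from the same sentence serve as the strong negative samples, all within the same single BERT pass.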
