Learning In-context Learning for Named Entity Recognition
Named entity recognition in real-world applications suffers from the diversity of entity types, the emergence of new entity types, and the lack of high-quality annotations. To address these problems, this paper proposes an in-context learning-based NER approach, which can effectively inject in-context NER ability into PLMs and recognize entities of novel types on-the-fly using only a few demonstrative instances. Specifically, we model PLMs as a meta-function λ_{instruction, demonstrations, text}. ℳ, so that a new entity extractor can be implicitly constructed by applying a new instruction and demonstrations to the PLM, i.e., (λ. ℳ)(instruction, demonstrations) → ℱ, where ℱ is a new entity extractor, i.e., ℱ: text → entities. To inject this in-context NER ability into PLMs, we propose a meta-function pre-training algorithm, which pre-trains PLMs by comparing the (instruction, demonstration)-initialized extractor with a surrogate golden extractor. Experimental results on 4 few-shot NER datasets show that our method can effectively inject in-context NER ability into PLMs and significantly outperforms the PLMs+fine-tuning counterparts.
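To make the meta-function reading concrete, below is a minimal, runnable Python sketch of how applying an instruction and demonstrations to a PLM implicitly yields an extractor ℱ: text → entities without any weight update. The plm_generate wrapper, the prompt template, and the "type: span" output convention are illustrative assumptions for this sketch, not the paper's released interface.

    # Sketch of (lambda. M)(instruction, demonstrations) -> F, where F: text -> entities.
    # `plm_generate` stands in for any PLM decoding call; all names are hypothetical.

    def parse_entities(output):
        # Assumed output convention: one "type: span" pair per line.
        pairs = []
        for line in output.strip().splitlines():
            if ":" in line:
                etype, span = line.split(":", 1)
                pairs.append((etype.strip(), span.strip()))
        return pairs

    def build_extractor(plm_generate, instruction, demonstrations):
        """Apply (instruction, demonstrations) to the PLM, returning the extractor F."""
        prefix = instruction + "\n" + "\n".join(demonstrations) + "\n"

        def extract(text):
            # The closure is the implicitly constructed extractor F:
            # no weights change; only the prompt conditions the PLM.
            return parse_entities(plm_generate(prefix + "Text: " + text + "\nEntities:"))

        return extract

    if __name__ == "__main__":
        # Toy stand-in for a real PLM call, just to make the sketch executable.
        def plm_generate(prompt):
            return "disease: COVID-19"

        extractor = build_extractor(
            plm_generate,
            instruction="Extract all disease entities from the text.",
            demonstrations=["Text: He had flu.\nEntities:\ndisease: flu"],
        )
        print(extractor("She was diagnosed with COVID-19."))  # [('disease', 'COVID-19')]

Under this reading, the meta-function pre-training objective then pushes the extractor built by build_extractor toward the behavior of a surrogate golden extractor fine-tuned on the same demonstrations.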