To fully evaluate the overall performance of different NLP models in a g...
Existing knowledge-enhanced methods have achieved remarkable results in ...
Knowledge distillation is of key importance to launching multilingual pr...
Parameter regularization or allocation methods are effective in overcomi...
Recent efforts of multimodal Transformers have improved Visually Rich Do...
Although pre-trained language models (PLMs) have achieved state-of-the-a...
Gazetteers are widely used in Chinese named entity recognition (NER) to en...
Due to the lack of sufficient data, existing multi-hop open domain que...
Knowledge-enhanced pre-trained language models (K-PLMs) are shown to be ...
In order to facilitate natural language understanding, the key is to eng...