Existing large language models have to run K times to generate a sequenc...
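(For context on the claim above: under standard autoregressive decoding, each new token requires one full forward pass over the model, so K output tokens cost K sequential calls. A minimal sketch follows, with a toy next_token function standing in for a real LLM forward pass; it illustrates the background claim only, not this paper's proposed method.)

    # Minimal sketch of standard left-to-right (autoregressive) decoding.
    # `next_token` is a toy stand-in for a full LLM forward pass; a real
    # model would score the whole vocabulary given the prefix.
    def next_token(prefix):
        return (prefix[-1] + 1) if prefix else 0

    def generate(prompt, k):
        tokens = list(prompt)
        for _ in range(k):  # one sequential model call per generated token
            tokens.append(next_token(tokens))
        return tokens

    print(generate([1, 2, 3], k=4))  # 4 new tokens -> 4 forward passes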
Multilingual pre-trained language models have demonstrated impressive (z...
Harvesting question-answer (QA) pairs from customer service chatlogs in t...
We present DualNER, a simple and effective framework to make full use of...
Contrastive learning has become a new paradigm for unsupervised sentence...
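(For context: the contrastive paradigm referred to here typically optimizes an InfoNCE objective over in-batch examples, as popularized for sentence embeddings by SimCSE. A standard formulation, which may differ from this paper's exact objective, is

    \ell_i = -\log \frac{\exp(\mathrm{sim}(h_i, h_i^{+})/\tau)}{\sum_{j=1}^{N} \exp(\mathrm{sim}(h_i, h_j^{+})/\tau)}

where h_i and h_i^{+} are two encoded views of the same sentence, \mathrm{sim} is cosine similarity, \tau is a temperature, and N is the batch size.)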
Most current multi-modal summarization methods adopt a cascaded approach,...
Unsupervised summarization methods have achieved remarkable results by i...
Pretrained language models (PLMs) trained on large-scale unlabeled corpu...
Large amounts of data have made neural machine translation (NMT) a big su...
Multi-choice Machine Reading Comprehension (MMRC) aims to select the cor...
Deep encoders have proven effective in improving neural machi...
In encoder-decoder neural models, multiple encoders are generally used ...
Neural architecture search (NAS) has advanced significantly in recent ye...