As we embark on a new era of LLMs, it becomes increasingly crucial to un...
The integration of retrieved passages and large language models (LLMs), ...
Question-answering (QA) tasks often investigate specific question types,...
The dominant paradigm of textual question answering systems is based on ...
Cross-lingual transfer of language models trained on high-resource langu...
Dense retrievers have made significant strides in obtaining state-of-the...
Parsing natural language questions into executable logical forms is a us...
Abstractive summarization systems leveraging pre-training language model...
Fusion-in-Decoder (FiD) (Izacard and Grave, 2020) is a generative questi...
While both extractive and generative readers have been successfully appl...
Dense neural text retrieval has achieved promising results on open-domai...
Existing KBQA approaches, despite achieving strong performance on i.i.d....
Task-adaptive pre-training (TAPT) and Self-training (ST) have emerged as...
Graph-to-text generation has benefited from pre-trained language models ...
We propose Dynamic Blocking, a decoding algorithm which enables large-sc...
Dialogue state trackers have made significant progress on benchmark data...
Task-oriented dialogue is often decomposed into three tasks: understandi...
Task-oriented dialog presents a difficult challenge encompassing multipl...
A significant barrier to progress in data-driven approaches to building ...
Recent advances in neural sequence-to-sequence models have led to promis...
Understanding and conversing about dynamic scenes is one of the key capa...
Simultaneous machine translation begins to translate each source sentenc...
Lingvo is a Tensorflow framework offering a complete solution for collab...
Recent studies have shown that embedding textual relations using deep ne...