Reformulating Sentence Ordering as Conditional Text Generation
The task of organizing a shuffled set of sentences into a coherent text is important in NLP and has been used to evaluate a machine's understanding of causal and temporal relations. We present Reorder-BART (RE-BART), a sentence ordering framework that leverages a pre-trained transformer-based model to identify a coherent order for a given set of shuffled sentences. We reformulate the task as a conditional text-to-marker generation setup in which the input is a set of shuffled sentences augmented with sentence-specific markers and the output is the sequence of position markers corresponding to the ordered text. Our framework achieves state-of-the-art performance across six datasets on the Perfect Match Ratio (PMR) and Kendall's tau (τ) metrics. We also evaluate in a zero-shot setting, showing that our model generalizes well across datasets. We additionally perform a series of experiments to understand the functioning and explore the limitations of our framework.
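To make the text-to-marker reformulation concrete, the following is a minimal Python sketch of how the source and target strings might be constructed and how the reported metrics are computed. The marker tokens (`<S1>`, `<S2>`, ...), the helper names, and the example sentences are illustrative assumptions for this sketch, not the authors' released implementation.

```python
from scipy.stats import kendalltau

def build_input(shuffled_sentences):
    """Prepend a position marker to each shuffled sentence and join them
    into a single source string for the seq2seq model."""
    return " ".join(f"<S{i + 1}> {sent}" for i, sent in enumerate(shuffled_sentences))

def build_target(gold_order):
    """The target is the sequence of markers in the coherent order,
    e.g. '<S2> <S3> <S1>'."""
    return " ".join(f"<S{i + 1}>" for i in gold_order)

shuffled = [
    "He poured the coffee into a mug.",
    "Tom woke up early.",
    "Then he walked to the kitchen.",
]
gold_order = [1, 2, 0]  # indices of the shuffled sentences in their coherent order

print(build_input(shuffled))
# <S1> He poured the coffee into a mug. <S2> Tom woke up early. <S3> Then he walked to the kitchen.
print(build_target(gold_order))
# <S2> <S3> <S1>

# Evaluation: PMR is the fraction of documents whose full order is predicted
# exactly; Kendall's tau measures rank correlation between predicted and gold orders.
pred_order = [1, 2, 0]
pmr = float(pred_order == gold_order)        # 1.0 when the entire order matches
tau, _ = kendalltau(pred_order, gold_order)  # 1.0 for a perfectly correlated ordering
```

At inference time, the generated marker sequence is mapped back to sentence indices to recover the predicted order, which is then scored against the gold order as above.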