Like people, LLMs do not always generate the best text for a given gener...
Mathematical reasoning is a fundamental aspect of human intelligence and...
Mathematical reasoning skills are essential for general-purpose intellig...
The formalization of existing mathematical proofs is a notoriously diffi...
The common practice for training commonsense models has gone from-human-...
Neural sequence models trained with maximum likelihood estimation have l...
The spectacular success of deep generative models calls for quantitative...
Despite its wide use, recent studies have revealed unexpected and undesi...
Understanding and creating mathematics using natural mathematical langua...
Neural autoregressive sequence models are used to generate sequences in ...
Despite strong performance on a variety of tasks, neural sequence models...
Generative dialogue models currently suffer from a number of problems wh...
Neural text generation is a key tool in natural language applications, b...
We propose a method for non-projective dependency parsing by incremental...
Standard sequential generation methods assume a pre-specified generation...
Consistency is a long-standing issue faced by dialogue models. In this p...
We study the problem of multiset prediction. The goal of multiset predic...
Humans process visual scenes selectively and sequentially using attentio...