This evidence-based position paper critiques current research practices
...
After just a few hundred training updates, a standard probabilistic mode...
Transformer language models that are trained on vast amounts of data hav...
A benchmark provides an ecosystem to measure the advancement of models w...
Our world is open-ended, non-stationary and constantly evolving; thus wh...
Textual representation learners trained on large amounts of data have
ac...
Prior work has shown that, on small amounts of training data, syntactic
...
Recurrent neural network grammars (RNNG) are generative models of langua...
Recurrent neural network grammars (RNNGs) are generative models of
(tree...
We describe DyNet, a toolkit for implementing neural network models base...
Recurrent neural network grammars (RNNG) are a recently proposed
probabi...
We propose a transition-based dependency parser using Recurrent Neural
N...
We introduce recurrent neural network grammars, probabilistic models of
...