Large language models generate fluent texts and can follow natural langu...
We present the call for papers for the BabyLM Challenge: Sample-efficien...
Previous studies investigating the syntactic abilities of deep learning ...
Humans can learn structural properties about a word from minimal experie...
State-of-the-art neural network models have achieved dizzyingly low perp...
Neural language models have achieved state-of-the-art performances on ma...
Deep learning sequence models have led to a marked increase in performan...
Recurrent Neural Networks (RNNs) trained on a language modeling task hav...
We deploy the methods of controlled psycholinguistic experimentation to ...
State-of-the-art LSTM language models trained on large corpora learn seq...
Recurrent neural networks (RNNs) are the state of the art in sequence mo...
RNN language models have achieved state-of-the-art perplexity results an...