Humans can learn languages from remarkably little experience. Developing...
When acquiring syntax, children consistently choose hierarchical rules o...
Machine translation has seen rapid progress with the advent of Transform...
What explains the dramatic progress from 20th-century to 21st-century AI...
Current language models can generate high-quality text. Are they simply ...
As the name implies, contextualized representations of language are typi...
How do learners acquire languages from the limited data available to the...
Sequence-based neural networks show significant sensitivity to syntactic...
Pretrained neural models such as BERT, when fine-tuned to perform natura...
Learners that are exposed to the same training data might generalize dif...
If the same neural architecture is trained multiple times on the same da...
Contextualized representation models such as ELMo (Peters et al., 2018a)...
We introduce a set of nine challenge tasks that test for the understandi...
Machine learning systems can often achieve high performance on a test se...
Work on the problem of contextualized word representation -- the develop...
Recurrent neural networks (RNNs) can learn continuous vector representat...
Neural network models have shown great success at natural language infer...
Syntactic rules in human language usually refer to the hierarchical stru...