Large multimodal models trained on natural documents, which interleave i...
The crystallization of modeling methods around the Transformer architect...
State-of-the-art neural language models can now be used to solve ad-hoc ...
PromptSource is a system for creating, sharing, and using natural langua...
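To ground what creating and using such natural language prompts looks like in practice, here is a minimal sketch with the promptsource package; the dataset name, the template selection, and the assumption that applying a template yields an (input, target) pair are illustrative guesses about the API rather than details taken from the line above.

    # Sketch (assumed promptsource API): load the prompt templates registered
    # for a dataset and apply one of them to a raw example.
    from datasets import load_dataset
    from promptsource.templates import DatasetTemplates

    dataset = load_dataset("ag_news", split="train")        # illustrative dataset choice
    templates = DatasetTemplates("ag_news")                  # templates contributed for ag_news
    template = templates[templates.all_template_names[0]]    # pick any available template

    result = template.apply(dataset[0])                      # assumed to return [prompted_input, target]
    print(result[0])
    print(result[1])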
Large language models have recently been shown to attain reasonable zero...
Pre-training has improved model accuracy for both classification and gen...
Recent prompt-based approaches allow pretrained language models to achie...
The scale, variety, and quantity of publicly-available NLP datasets has ...
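Since the line above refers to the growing pool of publicly available NLP datasets, a short sketch of programmatic access through the Hugging Face datasets library may help; the specific dataset ("squad") is an arbitrary choice for illustration, not something named in the text.

    # Sketch: load a public dataset and inspect one example with the `datasets` library.
    from datasets import load_dataset

    squad = load_dataset("squad", split="validation")   # "squad" used only as an example
    print(squad)                                         # column names and number of rows
    example = squad[0]
    print(example["question"], "->", example["answers"]["text"])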
The dominant approach in probing neural networks for linguistic properti...
State-of-the-art natural language processing (NLP) models often learn to...
Transformer-based language models such as BERT provide significant accur...
Magnitude pruning is a widely used strategy for reducing model size in p...
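For readers unfamiliar with the baseline being discussed, a minimal sketch of global magnitude pruning in PyTorch follows: the weights with the smallest absolute values are zeroed until a target sparsity is reached. The toy model, the 50% sparsity level, and the restriction to linear layers are illustrative assumptions, not part of the work described above.

    import torch
    import torch.nn as nn

    def magnitude_prune(model: nn.Module, sparsity: float = 0.5) -> None:
        """Zero out the fraction `sparsity` of weights with the smallest magnitudes."""
        weights = [m.weight for m in model.modules() if isinstance(m, nn.Linear)]
        scores = torch.cat([w.detach().abs().flatten() for w in weights])
        threshold = torch.quantile(scores, sparsity)      # global magnitude cutoff
        with torch.no_grad():
            for w in weights:
                w.mul_((w.abs() > threshold).float())     # zero the small weights in place

    # Toy usage on a small illustrative model (not a pretrained Transformer).
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
    magnitude_prune(model, sparsity=0.5)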
Recent advances in modern Natural Language Processing (NLP) research hav...
As Transfer Learning from large-scale pre-trained models becomes more pr...
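Since the line above concerns Transfer Learning from large-scale pre-trained models, a brief usage sketch with the transformers library is included; the pipeline task and the checkpoint name are illustrative assumptions rather than details from the text.

    # Sketch: inference with a pre-trained (distilled) checkpoint via the pipeline API.
    from transformers import pipeline

    classifier = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",  # common example checkpoint
    )
    print(classifier("Transfer learning from pre-trained models is remarkably effective."))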
We introduce a new approach to generative data-driven dialogue systems (...
Much effort has been devoted to evaluating whether multi-task learning ca...