While code-mixing is a common linguistic practice in many parts of the w...
As language models grow ever larger, the need for large-scale high-quali...
Biomedical data and benchmarks are highly valuable yet very limited in l...
We present ViT5, a pretrained Transformer-based encoder-decoder model fo...
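Since ViT5 is described as a pretrained Transformer encoder-decoder, a minimal sketch of how such a checkpoint is typically loaded and used for generation with the Hugging Face transformers API is given below; the checkpoint name "VietAI/vit5-base" and the generation settings are assumptions for illustration, not details taken from the abstract.

    # Minimal sketch, assuming a public "VietAI/vit5-base" checkpoint exists
    # on the Hugging Face Hub; adjust the name to the released model.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("VietAI/vit5-base")
    model = AutoModelForSeq2SeqLM.from_pretrained("VietAI/vit5-base")

    text = "Vietnamese input text for the encoder-decoder model."
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    output_ids = model.generate(**inputs, max_length=128, num_beams=4)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))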
Text summarization is a challenging task within natural language process...
In this paper, we propose SPBERT, a transformer-based language model pre...
In this paper, we propose a Hierarchical Transformer model for Vietnames...
We present CoTexT, a pre-trained, transformer-based encoder-decoder mode...
This paper proposes several transformer-based approaches for Reliable In...