Preserving training dynamics across batch sizes is an important tool for...
Despite progress in vision-based inspection algorithms, real-world indus...
Recent work has shown that it is possible to train an unsupervised autom...
Training stability is of great importance to Transformers. In this work,...
Self-training has been shown to be helpful in addressing data scarcity f...
Continuous pseudo-labeling (PL) algorithms such as slimIPL have recently...
Self-training (ST) and self-supervised learning (SSL) methods have demon...
Self-training (ST), or pseudo-labeling has sparked significant interest ...
Transformers have gained increasing popularity in a wide range of applic...
As the computational requirements for machine learning systems and the s...
Semi-supervised learning through pseudo-labeling has become a staple of ...
In this paper, we study training of automatic speech recognition systems ...
In this paper, we introduce the Kaizen framework that uses a continuousl...
Without positional information, attention-based transformer neural netwo...
Self-supervised learning of speech representations has been a very activ...
Self-supervised learning (SSL) has shown promise in learning representat...
Is pushing numbers on a single benchmark valuable in automatic speech re...
Recent results in end-to-end ASR have demonstrated the efficacy of simpl...
Self-training and unsupervised pre-training have emerged as effective ap...
Pseudo-labeling has recently shown promise in end-to-end automatic speec...
We design an online end-to-end speech recognition system based on Time-D...
We introduce a new collection of spoken English audio suitable for train...
We study ResNet-, Time-Depth Separable ConvNets-, and Transformer-based ...
Lexicon-free speech recognition naturally deals with the problem of out-...
In machine learning ensemble methods have demonstrated high accuracy for...
Identifying the flavour of neutral B mesons production is one of the mos...