Guided contrastive self-supervised pre-training for automatic speech recognition

10/22/2022
by Aparna Khare et al.

Contrastive Predictive Coding (CPC) is a representation learning method that maximizes the mutual information between intermediate latent representations and the output of a given model. It can be used to effectively initialize the encoder of an Automatic Speech Recognition (ASR) model. We present a novel modification of CPC called Guided Contrastive Predictive Coding (GCPC). Our proposed method maximizes the mutual information between representations from a prior-knowledge model and the output of the model being pre-trained, allowing prior knowledge injection during pre-training. We validate our method on 3 ASR tasks: German, French and English. Our method outperforms CPC pre-training on all three datasets, reducing the Word Error Rate (WER) by 4.44%, 6.55% and 15.43% relative respectively, compared to training from scratch, while CPC pre-training only brings 2.96%, 1.01% and 14.39% relative WER reduction respectively.
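For readers who want the mechanics: CPC-style pre-training optimizes an InfoNCE contrastive loss, in which each predicted frame must identify its true target frame among negatives, and minimizing this loss maximizes a lower bound on the mutual information between predictions and targets. The sketch below is a minimal illustration, not the authors' implementation; the module names (encoder, prior_model), the tensor shapes, and the single-step alignment (CPC proper predicts several steps ahead) are simplifying assumptions.

import torch
import torch.nn.functional as F

def info_nce_loss(predictions: torch.Tensor,
                  targets: torch.Tensor,
                  temperature: float = 0.1) -> torch.Tensor:
    """predictions, targets: (T, D) per-frame representations for one utterance.

    Each frame's prediction is scored against all T target frames; the
    matching frame is the positive, every other frame serves as a negative.
    """
    predictions = F.normalize(predictions, dim=-1)
    targets = F.normalize(targets, dim=-1)
    logits = predictions @ targets.t() / temperature   # (T, T) similarity matrix
    labels = torch.arange(logits.size(0), device=logits.device)  # positives on the diagonal
    return F.cross_entropy(logits, labels)

# Usage sketch (hypothetical modules). In plain CPC, targets are the model's
# own future latents; in GCPC, they come from a frozen prior-knowledge model:
# z = encoder(waveform)               # (T, D) latents being pre-trained
# g = prior_model(waveform).detach()  # (T, D) prior-knowledge targets
# loss = info_nce_loss(z, g)

Detaching the prior-knowledge targets keeps that model fixed, so the gradient only shapes the encoder being pre-trained; this is what the abstract means by injecting prior knowledge during pre-training.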
