To What Degree Can Language Borders Be Blurred In BERT-based Multilingual Spoken Language Understanding?

11/10/2020
by Quynh Do, et al.

This paper addresses the question of to what degree a BERT-based multilingual Spoken Language Understanding (SLU) model can transfer knowledge across languages. Through experiments, we show that although such a model performs well even on distant language groups, a gap to the ideal multilingual performance remains. In addition, we propose a novel BERT-based adversarial model architecture that learns language-shared and language-specific representations for multilingual SLU. Our experimental results demonstrate that the proposed model is capable of narrowing the gap to the ideal multilingual performance.
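The abstract does not spell out how the adversarial separation of language-shared and language-specific representations is implemented, so the following is only a minimal sketch of one common way such an architecture is built: a shared multilingual BERT encoder whose features are pushed toward language invariance by a language discriminator behind a gradient-reversal layer, alongside a private encoder whose features remain language-specific. The class names, the dual-encoder layout, the `lambd` scaling factor, and the use of `bert-base-multilingual-cased` are all assumptions for illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; flips the gradient sign in the backward pass.
    Standard trick for adversarial feature learning."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class AdversarialMultilingualSLU(nn.Module):
    """Hypothetical sketch of a shared/private BERT architecture for multilingual SLU.
    The shared encoder is trained adversarially against a language discriminator,
    while the private encoder keeps language-specific information."""

    def __init__(self, num_intents, num_slots, num_languages,
                 bert_name="bert-base-multilingual-cased", lambd=0.1):
        super().__init__()
        self.shared_encoder = BertModel.from_pretrained(bert_name)
        self.private_encoder = BertModel.from_pretrained(bert_name)
        hidden = self.shared_encoder.config.hidden_size
        self.lambd = lambd
        # The discriminator sees only shared features (through gradient reversal),
        # which pushes the shared encoder toward language-invariant representations.
        self.language_discriminator = nn.Linear(hidden, num_languages)
        # SLU heads consume the concatenation of shared and private features.
        self.intent_classifier = nn.Linear(2 * hidden, num_intents)
        self.slot_classifier = nn.Linear(2 * hidden, num_slots)

    def forward(self, input_ids, attention_mask):
        shared = self.shared_encoder(input_ids, attention_mask=attention_mask)
        private = self.private_encoder(input_ids, attention_mask=attention_mask)
        shared_seq = shared.last_hidden_state
        private_seq = private.last_hidden_state
        shared_cls, private_cls = shared_seq[:, 0], private_seq[:, 0]

        # Intent detection from the [CLS] positions, slot filling per token.
        intent_logits = self.intent_classifier(
            torch.cat([shared_cls, private_cls], dim=-1))
        slot_logits = self.slot_classifier(
            torch.cat([shared_seq, private_seq], dim=-1))

        # Adversarial branch: reversed gradients make the shared encoder
        # work against the language discriminator, blurring language borders.
        reversed_cls = GradientReversal.apply(shared_cls, self.lambd)
        language_logits = self.language_discriminator(reversed_cls)
        return intent_logits, slot_logits, language_logits
```

In training, a cross-entropy loss on `language_logits` (with the true language label) would be added to the intent and slot losses; because of the gradient reversal, minimizing that loss in the discriminator simultaneously drives the shared encoder toward representations that do not reveal the input language.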

