What makes multilingual BERT multilingual?

10/20/2020
by Chi-Liang Liu, et al.

Multilingual BERT has recently been shown to work remarkably well on cross-lingual transfer tasks, clearly outperforming static, non-contextualized word embeddings. In this work, we provide an in-depth experimental study to supplement the existing literature on cross-lingual ability. We compare the cross-lingual ability of non-contextualized and contextualized representation models trained on the same data. We find that data size and context window size are crucial factors for transferability.
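The abstract does not specify how context window size was controlled. One plausible mechanism is a local attention mask that caps how far each token can attend; a minimal PyTorch sketch of that idea follows (the function local_attention_mask and the toy parameters are hypothetical, for illustration only, and are not taken from the paper):

```python
import torch

def local_attention_mask(seq_len: int, window: int) -> torch.Tensor:
    """Boolean (seq_len, seq_len) mask: True where token i may attend
    to token j, i.e. where |i - j| <= window."""
    positions = torch.arange(seq_len)
    return (positions[None, :] - positions[:, None]).abs() <= window

# Toy example: 8 tokens, each allowed to see 2 neighbors on either side.
mask = local_attention_mask(seq_len=8, window=2)
scores = torch.randn(8, 8)                        # stand-in attention logits
scores = scores.masked_fill(~mask, float("-inf"))
weights = scores.softmax(dim=-1)                  # each row sums to 1 over its window
print(weights.shape)                              # torch.Size([8, 8])
```

With window=0 each token attends only to itself, so the model degenerates into a non-contextualized embedding lookup; widening the window gradually restores contextualization, which is one way to probe how much context contributes to cross-lingual transfer.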
