Enhancing Cross-lingual Prompting with Mask Token Augmentation

02/15/2022
by Meng Zhou, et al.

Prompting shows promising results in few-shot scenarios. However, its strength for multilingual/cross-lingual problems has not been fully exploited. Zhao and Schütze (2021) made initial explorations in this direction by showing that cross-lingual prompting outperforms cross-lingual finetuning. In this paper, we conduct an empirical analysis of the effect of each component in cross-lingual prompting and derive Universal Prompting across languages, which helps alleviate the discrepancies between source-language training and target-language inference. Based on this, we propose a mask token augmentation framework to further improve the performance of prompt-based cross-lingual transfer. Notably, for XNLI, our method achieves 46.54% accuracy with only 16 English training examples per class, significantly better than the 34.99% of finetuning.
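For context, the setup the abstract builds on can be sketched as a cloze task: a masked language model fills a mask token inserted into a fixed template, and a verbalizer maps the predicted word to a class label. The sketch below illustrates this for an XNLI-style premise/hypothesis pair. The template, the English verbalizer words ("yes"/"maybe"/"no"), and the xlm-roberta-base backbone are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of cloze-style cross-lingual prompting for XNLI-like inputs.
# Template, verbalizer, and backbone are assumptions made for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "xlm-roberta-base"  # a multilingual MLM; the paper's backbone may differ
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

# Verbalizer: one pivot-language word per XNLI label (illustrative choice).
verbalizer = {"entailment": "yes", "neutral": "maybe", "contradiction": "no"}

def predict_nli(premise: str, hypothesis: str) -> str:
    """Score each verbalizer word at the mask position; return the best label."""
    # One fixed cloze template is reused for every input language, in the
    # spirit of a single universal prompt shared across languages.
    text = f"{premise} ? {tokenizer.mask_token} , {hypothesis}"
    inputs = tokenizer(text, return_tensors="pt")
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0].item()
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    scores = {}
    for label, word in verbalizer.items():
        # Take the first subword id of the verbalizer word.
        word_id = tokenizer.encode(word, add_special_tokens=False)[0]
        scores[label] = logits[word_id].item()
    return max(scores, key=scores.get)

# Target-language inference with the source-language (English) template:
print(predict_nli("Der Hund schläft auf dem Sofa.", "Ein Tier ruht sich aus."))
```

The paper's contributions, universal prompting and mask token augmentation, refine this basic recipe; the sketch shows only the underlying cloze-style transfer setup it starts from.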

