Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer

03/16/2022
by   Huiyuan Lai, et al.

We exploit the pre-trained seq2seq model mBART for multilingual text style transfer. Using machine-translated data as well as gold-aligned English sentences yields state-of-the-art results in the three target languages we consider. Moreover, in view of the general scarcity of parallel data, we propose a modular approach for multilingual formality transfer, consisting of two training strategies that target adaptation to both language and task. Our approach achieves competitive performance without monolingual task-specific parallel data and can be applied to other style transfer tasks as well as to other languages.
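To make the data strategy concrete, the following is a minimal illustrative sketch (not the authors' code) of how gold English formality pairs and machine-translated pairs might be combined into a single training set, each example tagged with an mBART-style language code. All function and variable names here are hypothetical.

```python
# Hypothetical sketch: combine gold English (informal, formal) pairs with
# machine-translated pairs per target language, tagging each example with
# an mBART-style language code (e.g. "en_XX", "it_IT").

def build_training_pairs(gold_en_pairs, mt_pairs_by_lang):
    """Return a flat list of (lang_code, source, target) examples.

    gold_en_pairs: list of (informal, formal) English sentence pairs.
    mt_pairs_by_lang: dict mapping a language code to machine-translated
        (informal, formal) pairs in that language.
    """
    examples = [("en_XX", src, tgt) for src, tgt in gold_en_pairs]
    for lang, pairs in mt_pairs_by_lang.items():
        examples.extend((lang, src, tgt) for src, tgt in pairs)
    return examples

gold = [("hey, u coming?", "Hello, are you coming?")]
mt = {"it_IT": [("ehi, vieni?", "Ciao, vieni?")]}
data = build_training_pairs(gold, mt)
```

In an mBART fine-tuning setup, the language code would typically be supplied as the tokenizer's source/target language rather than stored alongside the text, but a flat tagged list like this keeps the mixed gold-plus-MT training signal easy to inspect.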
