Meta-Learning for Natural Language Understanding under Continual Learning Framework

11/03/2020
by   Jiacheng Wang, et al.

Neural networks have been recognized for their accomplishments in tackling various natural language understanding (NLU) tasks. Methods have been developed to train a single robust model that handles multiple tasks and learns a general representation of text. In this paper, we implement the model-agnostic meta-learning (MAML) and online-aware meta-learning (OML) meta-objectives under a continual learning framework for NLU tasks. We validate our methods on selected SuperGLUE and GLUE benchmarks.
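To make the MAML meta-objective concrete, the sketch below shows a first-order MAML loop on toy 1-D regression tasks: an inner loop adapts a copy of the parameters on each task's support set, and the outer loop updates the shared initialization using query-set gradients. The task distribution, linear model, and step sizes are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(w, x, y):
    """Gradient of mean squared error for the linear model y_hat = w * x."""
    return 2 * np.mean((w * x - y) * x)

def maml_step(w, tasks, inner_lr=0.05, meta_lr=0.1, inner_steps=1):
    """One meta-update: adapt per task, then average query-set gradients
    (first-order MAML: second-order terms are dropped)."""
    meta_grad = 0.0
    for x_s, y_s, x_q, y_q in tasks:
        w_task = w
        for _ in range(inner_steps):              # inner-loop adaptation
            w_task -= inner_lr * loss_grad(w_task, x_s, y_s)
        meta_grad += loss_grad(w_task, x_q, y_q)  # outer-loop signal
    return w - meta_lr * meta_grad / len(tasks)

def sample_task():
    """A task is a random slope; support/query splits come from that task.
    (Hypothetical task family chosen for illustration.)"""
    a = rng.uniform(0.5, 2.0)
    x = rng.uniform(-1, 1, size=20)
    y = a * x
    return x[:10], y[:10], x[10:], y[10:]

w = 0.0
for _ in range(100):
    w = maml_step(w, [sample_task() for _ in range(4)])
```

After meta-training, one inner-loop step on a new task's support set should already reduce the query loss relative to the unadapted initialization; this fast-adaptation property is what the meta-objective optimizes for.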

