Optimizing Reusable Knowledge for Continual Learning via Metalearning

06/09/2021
by Julio Hurtado, et al.

When learning tasks over time, artificial neural networks suffer from a problem known as Catastrophic Forgetting (CF). This happens when the weights of a network are overwritten during the training of a new task, causing the loss of old information. To address this issue, we propose MetA Reusable Knowledge, or MARK, a new method that fosters weight reusability instead of overwriting when learning a new task. Specifically, MARK keeps a set of shared weights among tasks. We envision these shared weights as a common Knowledge Base (KB) that is not only used to learn new tasks, but also enriched with new knowledge as the model learns new tasks. The key components behind MARK are two-fold. On the one hand, a metalearning approach provides the key mechanism to incrementally enrich the KB with new knowledge and to foster weight reusability among tasks. On the other hand, a set of trainable masks provides the key mechanism to selectively choose from the KB the relevant weights to solve each task. By using MARK, we achieve state-of-the-art results in several popular benchmarks, surpassing the best performing methods in terms of average accuracy by over 10% on the 20-Split-MiniImageNet dataset, while achieving almost zero forgetfulness using 55% of the parameters. Furthermore, an ablation study provides evidence that, indeed, MARK is learning reusable knowledge that is selectively used by each task.
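The core mechanism described above — a single shared Knowledge Base of weights, with a trainable per-task mask that selects which of those weights each task uses — can be sketched in a few lines. This is only an illustrative toy in NumPy, not the paper's actual architecture: the shapes, the soft sigmoid mask, and the single linear layer are all assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration; the paper's architecture differs.
IN_DIM, KB_DIM, N_TASKS = 8, 16, 3

# Shared Knowledge Base: one weight matrix reused by every task.
kb_weights = rng.standard_normal((IN_DIM, KB_DIM))

# One trainable mask per task (here random logits stand in for
# learned ones); the mask decides which KB features a task uses.
task_mask_logits = rng.standard_normal((N_TASKS, KB_DIM))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, task_id):
    """Compute shared KB features, then gate them with the task's mask."""
    mask = sigmoid(task_mask_logits[task_id])  # soft selection in (0, 1)
    features = x @ kb_weights                  # shared computation for all tasks
    return features * mask                     # task-specific feature selection

x = rng.standard_normal(IN_DIM)
out_t0 = forward(x, task_id=0)
out_t1 = forward(x, task_id=1)

print(out_t0.shape)  # each task reads the same KB through its own mask
```

During training, only the current task's mask (and, via metalearning, the KB itself) would be updated, so new tasks reuse and enrich the shared weights rather than overwrite them.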
