Semi-Supervised Few-Shot Learning with Local and Global Consistency

03/06/2019
by Ahmed Ayyad, et al.

Learning from a few examples is a key characteristic of human intelligence that AI researchers have been eager to model. With web-scale data being mostly unlabeled, several recent works have shown that few-shot learning performance can be significantly improved with access to unlabeled data, a setting known as semi-supervised few-shot learning (SS-FSL). We introduce an SS-FSL approach that we denote Consistent Prototypical Networks (CPN), which builds on top of Prototypical Networks. We propose new loss terms that leverage unlabeled data by enforcing notions of local and global consistency. Our work shows the effectiveness of our consistency losses in the semi-supervised few-shot setting. Our model outperforms the state-of-the-art on most benchmarks, showing large improvements in some cases. For example, on the mini-Imagenet 5-shot classification task, we obtain 70.1% accuracy. Moreover, our semi-supervised model, trained with only 40% of the labels, compares well against the vanilla prototypical network trained on 100% of the labels, even outperforming it in the 1-shot mini-Imagenet case with 51.03% accuracy. For reproducibility, we make our code publicly available.
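The abstract does not spell out the exact form of the consistency losses, so the sketch below is only a rough illustration of the general idea: a standard prototypical-network episode loss combined with a hypothetical local-consistency regularizer that asks predictions on an unlabeled example and on a perturbed copy of it to agree. The function names, the perturbation, and the specific regularizer are assumptions for illustration, not the authors' CPN formulation.

    import torch
    import torch.nn.functional as F

    def prototypes(support_emb, support_labels, n_classes):
        # Class prototypes: mean embedding of the support examples of each class.
        return torch.stack([support_emb[support_labels == c].mean(dim=0)
                            for c in range(n_classes)])

    def proto_log_probs(emb, protos):
        # Softmax over negative squared Euclidean distances to each prototype.
        dists = torch.cdist(emb, protos) ** 2
        return F.log_softmax(-dists, dim=1)

    def episode_loss(embed_net, support_x, support_y, query_x, query_y,
                     unlabeled_x, n_classes, consistency_weight=1.0):
        protos = prototypes(embed_net(support_x), support_y, n_classes)

        # Standard prototypical-network cross-entropy on the query set.
        sup_loss = F.nll_loss(proto_log_probs(embed_net(query_x), protos), query_y)

        # Hypothetical local-consistency term: the prediction for an unlabeled
        # example and for a lightly perturbed copy of it should agree.
        log_p_clean = proto_log_probs(embed_net(unlabeled_x), protos)
        log_p_noisy = proto_log_probs(
            embed_net(unlabeled_x + 0.1 * torch.randn_like(unlabeled_x)), protos)
        consistency = F.kl_div(log_p_noisy, log_p_clean.exp().detach(),
                               reduction="batchmean")

        return sup_loss + consistency_weight * consistency

A global consistency term would analogously encourage predictions to agree across a structure built over all unlabeled points (e.g. a similarity graph), but that part is omitted from this sketch.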
