RankingMatch: Delving into Semi-Supervised Learning with Consistency Regularization and Ranking Loss

10/09/2021
by   Trung Q. Tran, et al.

Semi-supervised learning (SSL) has played an important role in leveraging unlabeled data when labeled data is limited. One of the most successful SSL approaches is based on consistency regularization, which encourages the model to produce unchanged outputs under perturbations of the input. However, less attention has been paid to inputs that share the same label. Motivated by the observation that inputs with the same label should yield similar model outputs, we propose a novel method, RankingMatch, that considers not only the perturbed inputs but also the similarity among inputs having the same label. In particular, we introduce a new objective function, dubbed BatchMean Triplet loss, which has the advantage of computational efficiency while taking all input samples into account. Our RankingMatch achieves state-of-the-art performance across many standard SSL benchmarks with a variety of labeled data amounts, including 95.13 on CIFAR-100 with 10000 labels, and 97.76 and 97.77 on further benchmarks. Ablation studies also prove the efficacy of the proposed BatchMean Triplet loss against existing versions of Triplet loss.
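To make the "BatchMean" idea concrete, here is a minimal sketch of a BatchMean-style Triplet loss. This is an illustrative assumption, not the paper's exact formulation: each sample in the batch acts as an anchor, and the hinge is applied to the *mean* positive distance and the *mean* negative distance, so every sample contributes to the loss while only one O(B^2) distance matrix is needed (instead of enumerating all O(B^3) triplets). The function name, the Euclidean distance choice, and the margin value are assumptions for illustration.

```python
import numpy as np

def batchmean_triplet_loss(embeddings, labels, margin=1.0):
    """Illustrative BatchMean-style Triplet loss (a sketch, not the paper's
    exact formulation). For each anchor, the hinge is applied to the mean
    distance to all positives and the mean distance to all negatives in the
    batch, so all samples are taken into account."""
    n = embeddings.shape[0]
    # Pairwise Euclidean distance matrix, computed once for the whole batch.
    diff = embeddings[:, None, :] - embeddings[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1) + 1e-12)
    same = labels[:, None] == labels[None, :]   # same-label mask
    eye = np.eye(n, dtype=bool)                 # exclude anchor itself
    losses = []
    for a in range(n):
        pos = dist[a][same[a] & ~eye[a]]        # distances to positives
        neg = dist[a][~same[a]]                 # distances to negatives
        if pos.size == 0 or neg.size == 0:
            continue  # anchor has no positive or no negative in this batch
        losses.append(max(0.0, pos.mean() - neg.mean() + margin))
    return float(np.mean(losses)) if losses else 0.0
```

For example, when the classes are already well separated in embedding space, the mean negative distance exceeds the mean positive distance by more than the margin and the loss is zero; when embeddings of different classes overlap, the loss is positive and pushes same-label embeddings together, matching the abstract's observation that inputs with the same label should yield similar model outputs.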
