What happens when self-supervision meets Noisy Labels?

10/13/2019
by Devraj Mandal, et al.

The major driving force behind the immense success of deep learning models is the availability of large datasets with clean labels. Unfortunately, such labels are very difficult to obtain, which has motivated research on training deep models in the presence of label noise and on ways to avoid over-fitting to the noisy labels. In this work, we build upon the seminal work in this area, Co-teaching, and propose a simple yet efficient approach termed mCT-S2R (modified co-teaching with self-supervision and re-labeling) for this task. First, to deal with a significant amount of noise in the labels, we use self-supervision to generate robust features without using any labels. Next, using a parallel network architecture, we obtain an estimate of the cleanly labeled portion of the data. Finally, a part of the data estimated to be noisily labeled is re-labeled, and network training resumes with the augmented data. Extensive experiments on three standard datasets show the effectiveness of the proposed framework.
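To make the data-handling steps of the abstract concrete, here is a minimal PyTorch sketch of the two operations it describes: the small-loss selection used by Co-teaching to estimate the cleanly labeled subset, followed by confidence-based re-labeling of the suspect remainder. The function names, the confidence threshold, and the use of softmax confidence as the re-labeling criterion are illustrative assumptions, not the authors' implementation; the self-supervised feature-learning stage is omitted.

```python
import torch
import torch.nn.functional as F

def select_clean(logits, labels, keep_ratio):
    """Co-teaching-style small-loss trick: treat the keep_ratio fraction of
    samples with the lowest per-sample cross-entropy as (probably) clean."""
    losses = F.cross_entropy(logits, labels, reduction="none")
    num_keep = int(keep_ratio * labels.size(0))
    order = torch.argsort(losses)          # ascending loss
    return order[:num_keep], order[num_keep:]  # (clean_idx, noisy_idx)

def relabel(logits, noisy_idx, threshold=0.9):
    """Re-label only those suspect samples the network predicts confidently.
    The 0.9 cutoff is a hypothetical choice, not taken from the paper."""
    probs = F.softmax(logits[noisy_idx], dim=1)
    conf, pseudo_labels = probs.max(dim=1)
    keep = conf >= threshold
    return noisy_idx[keep], pseudo_labels[keep]
```

In a co-teaching setup, each of the two parallel networks would apply select_clean to the other's batch, and the re-labeled samples would be merged back into the training set before training resumes.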

Related research

08/06/2019 · Deep Self-Learning From Noisy Labels
ConvNets achieve good results when training from clean data, but learnin...

05/31/2017 · Toward Robustness against Label Noise in Training Deep Discriminative Neural Networks
Collecting large training datasets, annotated with high-quality labels, ...

03/28/2019 · Handling Noisy Labels for Robustly Learning from Self-Training Data for Low-Resource Sequence Labeling
In this paper, we address the problem of effectively self-training neura...

11/12/2022 · Robust Training of Graph Neural Networks via Noise Governance
Graph Neural Networks (GNNs) have become widely-used models for semi-sup...

01/24/2021 · Analysing the Noise Model Error for Realistic Noisy Label Data
Distant and weak supervision allow to obtain large amounts of labeled tr...

09/09/2019 · Self-Teaching Networks
We propose self-teaching networks to improve the generalization capacity...

06/27/2022 · Compressing Features for Learning with Noisy Labels
Supervised learning can be viewed as distilling relevant information fro...
