Open-world Contrastive Learning

08/04/2022
by Yiyou Sun, et al.

Recent advances in contrastive learning have shown remarkable performance. However, the vast majority of approaches are limited to the closed-world setting. In this paper, we enrich the landscape of representation learning by tapping into an open-world setting, where unlabeled samples from novel classes can naturally emerge in the wild. To bridge the gap, we introduce a new learning framework, open-world contrastive learning (OpenCon). OpenCon tackles the challenges of learning compact representations for both known and novel classes, and facilitates novelty discovery along the way. We demonstrate the effectiveness of OpenCon on challenging benchmark datasets and establish competitive performance. On the ImageNet dataset, OpenCon significantly outperforms the current best method by 11.9% in classification accuracy. We hope that our work will open up new doors for future work to tackle this important problem.
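To make the setting concrete, the sketch below illustrates a generic supervised contrastive loss in which ground-truth labels for known classes and pseudo-labels for unlabeled samples from novel classes (e.g., obtained by clustering) are treated uniformly. This is a minimal illustration of the general idea, not the paper's OpenCon implementation; all names (`features`, `labels`, `temperature`) are assumptions for the example.

```python
# Minimal sketch of a label/pseudo-label contrastive loss (illustrative only,
# not the OpenCon method from the paper).
import torch
import torch.nn.functional as F

def contrastive_loss(features, labels, temperature=0.1):
    """features: (N, D) embeddings; labels: (N,) class ids, where known
    samples carry true labels and novel samples carry pseudo-labels."""
    features = F.normalize(features, dim=1)
    sim = features @ features.t() / temperature           # (N, N) similarities
    n = features.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(self_mask, -1e9)                # exclude self-pairs
    # positives: other samples sharing the same (pseudo-)label
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # mean log-likelihood of positives per anchor; skip anchors with none
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob.masked_fill(~pos_mask, 0.0)).sum(dim=1) / pos_counts
    return loss[pos_mask.any(dim=1)].mean()
```

In this toy formulation, pulling together samples that share a label or pseudo-label is what yields compact clusters for both known and novel classes; the quality of the pseudo-labels for novel classes is the key open-world difficulty the paper addresses.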
