Neighborhood Consensus Contrastive Learning for Backward-Compatible Representation

08/07/2021
by Shengsen Wu, et al.

In object re-identification (ReID), the development of deep learning techniques often involves model updates and re-deployment. Re-extracting image features for a large-scale gallery every time a new model is deployed is prohibitively expensive. Backward-compatible representation learning therefore aims to make "new" features directly comparable with "old" features, removing the need for re-extraction. Existing backward-compatible methods simply impose constraints in the embedding space or the discriminative space and ignore the intra-class variance of the old embeddings, which risks damaging the discriminability of the new embeddings. In this work, we propose a Neighborhood Consensus Contrastive Learning (NCCL) method, which learns backward-compatible representations from a neighborhood consensus perspective using both embedding structures and discriminative knowledge. With NCCL, the new embeddings are aligned with and improved over the old embeddings in a multi-cluster view. In addition, we propose a scheme to filter out old embeddings with low credibility, which further improves compatibility robustness. Our method ensures backward compatibility without impairing the accuracy of the new model, and in most scenarios it even improves the new model's accuracy.
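To make the general idea concrete, the sketch below shows a generic cross-model contrastive loss, not the paper's actual NCCL objective: each new embedding is pulled toward old embeddings of the same identity and pushed away from old embeddings of other identities, and old embeddings with low credibility can optionally be dropped. The function name, the credibility filter, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's NCCL implementation) of a
# backward-compatible contrastive loss between new- and old-model features.
import torch
import torch.nn.functional as F

def compatibility_contrastive_loss(new_emb, old_emb, labels,
                                   temperature=0.07, credibility=None,
                                   credibility_threshold=0.5):
    """new_emb, old_emb: (B, D) features of the same images from the new and
    old models; labels: (B,) identity labels; credibility: optional (B,)
    scores used to discard unreliable old embeddings (hypothetical scheme)."""
    new_emb = F.normalize(new_emb, dim=1)
    old_emb = F.normalize(old_emb, dim=1)

    # Optionally filter out low-credibility old embeddings.
    if credibility is not None:
        keep = credibility > credibility_threshold
        new_emb, old_emb, labels = new_emb[keep], old_emb[keep], labels[keep]

    # Cross-model similarities: row i compares a new embedding with all old ones.
    logits = new_emb @ old_emb.t() / temperature             # (B, B)
    pos_mask = labels.unsqueeze(1).eq(labels.unsqueeze(0))   # same-identity pairs

    # InfoNCE-style loss: same-identity old embeddings are positives,
    # different-identity old embeddings are negatives.
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    loss = -(log_prob * pos_mask).sum(1) / pos_mask.sum(1).clamp(min=1)
    return loss.mean()
```

In practice such a compatibility term would be added to the new model's standard ReID objective, so that the new embeddings remain discriminative while staying comparable with the frozen old gallery features.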
