Robust Low-Rank Subspace Segmentation with Semidefinite Guarantees
Recently, a line of research has proposed to employ Spectral Clustering (SC) to segment (group) high-dimensional structural data, such as data (approximately) lying on subspaces or low-dimensional manifolds. (Throughout the paper we use segmentation, clustering, and grouping, and their verb forms, interchangeably. Following liu2010robust, we use the term "subspace" to denote both linear and affine subspaces; there is a trivial conversion between the two, as mentioned therein.) By learning the affinity matrix in the form of sparse reconstruction, techniques proposed in this vein often considerably boost performance in subspace settings where traditional SC can fail. Despite this success, fundamental problems have been left unsolved: the spectral properties of the learned affinity matrix cannot be gauged in advance, and an ugly symmetrization step is often needed to post-process the affinity matrix before it can serve as SC input. We therefore advocate enforcing the symmetric positive semidefinite constraint explicitly during learning (Low-Rank Representation with Positive SemiDefinite constraint, or LRR-PSD), and show that it can in fact be solved efficiently by a dedicated scheme rather than by general-purpose SDP solvers, which usually scale poorly. We provide rigorous mathematical derivations showing that, in its canonical form, LRR-PSD is equivalent to the recently proposed Low-Rank Representation (LRR) scheme liu2010robust, thereby offering theoretical and practical insights into both LRR-PSD and LRR, and inviting future research. In terms of computational cost, our proposal is at most comparable to that of LRR, if not lower. We validate our theoretical analysis and optimization scheme by experiments on both synthetic and real data sets.
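For concreteness, a plausible rendering of the two canonical programs contrasted above, assuming the standard LRR notation (data matrix X, representation matrix Z, nuclear norm ||.||_*); the symmetry and PSD constraints on Z are the addition that distinguishes LRR-PSD:

\begin{align*}
\text{(LRR)} \quad & \min_{Z} \; \|Z\|_{*} \quad \text{s.t.} \quad X = XZ, \\
\text{(LRR-PSD)} \quad & \min_{Z} \; \|Z\|_{*} \quad \text{s.t.} \quad X = XZ, \; Z = Z^{\top}, \; Z \succeq 0.
\end{align*}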
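The symmetrization step the abstract objects to can be illustrated with a minimal sketch, assuming a learned (generally asymmetric) representation Z from some reconstruction-based method; the rule W = (|Z| + |Z^T|)/2 used here is one common recipe for producing a valid SC affinity, not necessarily the exact post-processing the paper has in mind:

    import numpy as np
    from sklearn.cluster import SpectralClustering

    def affinity_from_representation(Z):
        # Ad hoc symmetrization: a learned Z is generally asymmetric,
        # so it is symmetrized before use as an SC affinity matrix.
        # LRR-PSD makes this step unnecessary, since its Z is
        # symmetric PSD by construction.
        return (np.abs(Z) + np.abs(Z.T)) / 2.0

    # Hypothetical usage: Z stands in for a learned n x n representation.
    Z = np.random.rand(100, 100)  # placeholder, not a real learned Z
    W = affinity_from_representation(Z)
    labels = SpectralClustering(n_clusters=3,
                                affinity="precomputed").fit_predict(W)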