Bayesian matrix completion with a spectral scaled Student prior: theoretical guarantee and efficient sampling
In this paper, we study the problem of matrix completion. A spectral scaled Student prior is exploited to favour the underlying low-rank structure of the data matrix. Importantly, we provide a thorough theoretical investigation of our approach, whereas such analyses are difficult to obtain and the theoretical understanding of Bayesian matrix completion remains limited. More precisely, we show that our Bayesian approach enjoys a minimax-optimal oracle inequality, which guarantees that our method performs well under model misspecification and under general sampling distributions. We also provide an efficient gradient-based sampling implementation of our approach using Langevin Monte Carlo, which is novel in Bayesian matrix completion. In particular, we show that our algorithms are significantly faster than the Gibbs sampler for this problem. To illustrate the attractive features of our inference strategy, numerical simulations are conducted and an application to image inpainting is presented.
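The abstract does not spell out the sampler, but the following minimal sketch illustrates what a gradient-based Langevin Monte Carlo scheme for Bayesian matrix completion might look like. It assumes Gaussian noise on the observed entries and a scaled Student-type low-rank-favouring prior whose log-density gradient takes the form -2*alpha*(lam^2 I + M M^T)^{-1} M; the function name, default hyperparameters, and this prior parameterisation are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def lmc_matrix_completion(Y, mask, n_iter=2000, step=1e-4,
                          sigma2=1.0, lam=1.0, alpha=None, seed=0):
    """Unadjusted Langevin Monte Carlo sketch for Bayesian matrix completion.

    Y    : (m, n) array with observed values (any finite value where unobserved).
    mask : (m, n) binary array, 1 where the entry is observed.

    Assumed model (for illustration only): Gaussian noise with variance sigma2
    on observed entries, and a scaled Student-type prior whose log-density
    gradient is -2*alpha*(lam^2 I + M M^T)^{-1} M.
    """
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    if alpha is None:
        alpha = (m + n + 2) / 2.0            # illustrative default choice
    M = np.zeros((m, n))                     # current state of the chain
    draws = []
    for t in range(n_iter):
        # Gradient of the Gaussian log-likelihood on observed entries.
        grad_lik = mask * (Y - M) / sigma2
        # Gradient of the assumed log-prior: -2*alpha*(lam^2 I + M M^T)^{-1} M.
        grad_prior = -2.0 * alpha * np.linalg.solve(
            lam ** 2 * np.eye(m) + M @ M.T, M)
        # Langevin update: gradient step plus injected Gaussian noise.
        M = M + step * (grad_lik + grad_prior) \
              + np.sqrt(2.0 * step) * rng.standard_normal((m, n))
        if t >= n_iter // 2:                 # discard first half as burn-in
            draws.append(M.copy())
    return np.mean(draws, axis=0)            # posterior-mean estimate

# Hypothetical usage: recover a rank-2 matrix from 50% of its entries.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    M_true = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 40))
    mask = (rng.random(M_true.shape) < 0.5).astype(float)
    Y = mask * (M_true + 0.1 * rng.standard_normal(M_true.shape))
    M_hat = lmc_matrix_completion(Y, mask)
    print("RMSE on missing entries:",
          np.sqrt(np.mean(((M_hat - M_true) * (1 - mask)) ** 2)))
```

The appeal of such a gradient-based update, as the abstract suggests, is that each iteration only requires the gradient of the log-posterior (a cheap masked residual plus a linear solve), avoiding the conditional-distribution derivations and per-coordinate updates a Gibbs sampler would need.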