Frame-constrained Total Variation Regularization for White Noise Regression
Despite the popularity and practical success of total variation (TV) regularization for function estimation, surprisingly little is known about its theoretical performance in a statistical setting. While TV regularization has long been known to be minimax optimal for denoising one-dimensional signals, in higher dimensions this has remained elusive until now. In this paper we consider frame-constrained TV estimators, including many well-known (overcomplete) frames, in a white noise regression model, and prove their minimax optimality w.r.t. the L^q-risk (1 < q ≤ (d+2)/d ∧ 2) up to a logarithmic factor in any dimension d ≥ 1. Overcomplete frames are an established tool in mathematical imaging and signal recovery, and their combination with TV regularization has been shown to give excellent results in practice, which our theory now confirms. Our results rely on a new interpolation inequality that relates the frames and certain Besov spaces to the risk functional.
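To fix ideas, a schematic formulation of such an estimator (not quoted from the paper; the frame {φ_λ}, index set Λ_n, threshold γ_n, and noise level σ are illustrative notation) could read as follows: observations come from the white noise regression model

\[
  \mathrm{d}Y(x) \;=\; f(x)\,\mathrm{d}x \;+\; \frac{\sigma}{\sqrt{n}}\,\mathrm{d}W(x),
  \qquad x \in [0,1]^d ,
\]

and a frame-constrained TV estimator minimizes total variation subject to the residual's frame coefficients being uniformly small,

\[
  \hat{f} \;\in\; \operatorname*{arg\,min}_{g}
  \Bigl\{\, |g|_{\mathrm{TV}} \;:\;
  \bigl|\langle \phi_{\lambda},\, \mathrm{d}Y - g \rangle\bigr| \le \gamma_n
  \ \text{ for all } \lambda \in \Lambda_n \,\Bigr\},
\]

with γ_n a universal threshold, typically of order \(\sqrt{\log n / n}\). This is a sketch of the general frame-constraint idea only; the precise estimator, frame conditions, and threshold choice are given in the full paper.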