Globally Convergent Accelerated Algorithms for Multilinear Sparse Logistic Regression with ℓ_0-constraints

09/17/2023
by   Weifeng Yang, et al.

A tensor is a multidimensional array. Regression methods based on low-rank tensor decomposition exploit this structural information to reduce the parameter count, and multilinear logistic regression is a powerful tool for analyzing such multidimensional data. To improve its efficacy and interpretability, we present a Multilinear Sparse Logistic Regression model with ℓ_0-constraints (ℓ_0-MLSR). In contrast to the ℓ_1-norm and ℓ_2-norm, the ℓ_0-norm constraint is better suited for feature selection. However, because it is nonconvex and nonsmooth, solving the resulting problem is challenging and convergence guarantees are lacking. The multilinear operation in ℓ_0-MLSR introduces additional nonconvexity. To tackle these challenges, we propose an Accelerated Proximal Alternating Linearized Minimization with Adaptive Momentum (APALM^+) method to solve the ℓ_0-MLSR model. We prove that APALM^+ ensures convergence of the objective function of ℓ_0-MLSR. We further show that APALM^+ converges globally to a first-order critical point and establish its convergence rate using the Kurdyka-Lojasiewicz property. Empirical results on synthetic and real-world datasets validate the superior performance of our algorithm in both accuracy and speed compared with other state-of-the-art methods.
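The abstract does not spell out how the ℓ_0-constraint is handled inside a proximal method. As background, the standard proximal (projection) step for an ℓ_0-ball constraint is hard thresholding: keep the k largest-magnitude entries and zero out the rest. The sketch below illustrates that generic operation only; it is not the paper's APALM^+ algorithm, and the function name `project_l0` is our own.

```python
import numpy as np

def project_l0(w, k):
    """Euclidean projection onto the l0-ball {x : ||x||_0 <= k}.

    Keeps the k entries of w with largest absolute value and zeros
    the rest (hard thresholding) -- the usual proximal step for an
    l0-sparsity constraint.
    """
    w = np.asarray(w, dtype=float)
    out = np.zeros_like(w)
    if k <= 0:
        return out
    # Indices of the k largest-magnitude entries (unordered is fine).
    idx = np.argpartition(np.abs(w), -k)[-k:]
    out[idx] = w[idx]
    return out

# Example: keep the 2 largest-magnitude coefficients.
print(project_l0([3.0, -1.0, 0.5, 2.0], 2))  # → [3. 0. 0. 2.]
```

In a proximal alternating scheme, a step like this would be applied to each factor of the multilinear model after its linearized gradient update, preserving the ℓ_0-constraint at every iteration.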
