Weak Supermodularity Assists Submodularity-based Approaches to Non-convex Constrained Optimization

05/29/2018
by Shinsaku Sakaue, et al.

Non-convex constrained optimization problems have many applications in machine learning. To obtain theoretically guaranteed methods for such problems, restricted strong convexity (RSC) and restricted smoothness (RSM) are often assumed. Recent studies have revealed that RSC and RSM are related to weak submodularity, which makes weakly submodular maximization a promising approach to non-convex constrained optimization. In this paper, we extend submodularity-based approaches by using weak supermodularity. We show that various set functions with (weak) submodularity also exhibit weak supermodularity, and that this fact yields beneficial theoretical results. Specifically, we first show that weak submodularity and weak supermodularity are implied by RSC and RSM; using this finding, we present a fixed-parameter-tractable approximation algorithm for ℓ_0-constrained minimization. We then consider non-convex optimization with submodular cost constraints. We show that submodular cost functions typically exhibit weak supermodularity, which we use to provide theoretical guarantees for a cost-benefit-greedy algorithm and an iterative-hard-thresholding-style algorithm. We compare the performance of these methods experimentally.
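
The iterative-hard-thresholding-style algorithm mentioned above builds on the standard IHT template for ℓ_0-constrained minimization: alternate a gradient step on the smooth objective with projection onto the set of k-sparse vectors. Below is a minimal, generic sketch of that template in Python; the function names, fixed step size, and toy least-squares example are illustrative assumptions and do not reproduce the paper's specific algorithm or its guarantees under RSC and RSM.

import numpy as np

def hard_threshold(x, k):
    # Keep the k largest-magnitude entries of x and zero out the rest.
    z = np.zeros_like(x)
    if k > 0:
        idx = np.argsort(np.abs(x))[-k:]
        z[idx] = x[idx]
    return z

def iht(grad_f, x0, k, step_size, n_iters=200):
    # Generic iterative hard thresholding for: minimize f(x) subject to ||x||_0 <= k.
    # grad_f returns the gradient of the smooth objective f at x.
    # This is the textbook IHT loop, not the paper's specific variant.
    x = hard_threshold(x0, k)
    for _ in range(n_iters):
        x = hard_threshold(x - step_size * grad_f(x), k)
    return x

# Toy usage: sparse least squares, f(x) = ||A x - b||^2 with a planted 5-sparse solution.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true
lipschitz = 2.0 * np.linalg.norm(A, 2) ** 2   # gradient Lipschitz constant of f
x_hat = iht(lambda x: 2.0 * A.T @ (A @ x - b), np.zeros(50), k=5, step_size=1.0 / lipschitz)

The step size 1/L (with L the gradient's Lipschitz constant) is the conventional conservative choice for IHT; analyses under RSC and RSM typically replace L with a restricted smoothness constant, but such refinements are beyond this sketch.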
