Differentiable Greedy Submodular Maximization with Guarantees and Gradient Estimators

05/06/2020
by Shinsaku Sakaue, et al.

We consider making the outputs of the greedy algorithm for monotone submodular function maximization differentiable with respect to the parameters of the objective function. Because the algorithm behaves discontinuously, some smoothing method is required. Our contribution is a theoretically guaranteed and widely applicable smoothing framework based on randomization. We prove that our smoothed greedy algorithm almost recovers the original approximation guarantees in expectation under cardinality and κ-extensible system constraints. We also show that unbiased gradient estimators of any expected output-dependent quantity can be obtained efficiently by sampling outputs. We confirm the utility and effectiveness of our framework by applying it to sensitivity analysis of the greedy algorithm and to decision-focused learning of parameterized submodular models.
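The abstract does not spell out the smoothing distribution, so the following is a minimal sketch of one natural instantiation: at each greedy step, an element is sampled with probability proportional to a softmax over marginal gains rather than chosen by argmax, which makes the expected output a smooth function of the objective's parameters. The names `f`, `ground_set`, and `temperature` are illustrative placeholders, not the paper's API, and the softmax distribution itself is an assumption rather than necessarily the paper's exact randomization.

```python
import numpy as np

def smoothed_greedy(f, ground_set, k, temperature=1.0, rng=None):
    """Randomized (smoothed) greedy for cardinality-constrained monotone
    submodular maximization.  Each step samples an element with probability
    proportional to a softmax over marginal gains; as temperature -> 0 this
    recovers the ordinary deterministic greedy algorithm.

    f           -- set function, f(list_of_elements) -> float
                   (assumed monotone submodular)
    ground_set  -- iterable of candidate elements (assumes len >= k)
    k           -- cardinality budget
    temperature -- softmax temperature controlling the smoothing strength
    """
    rng = rng or np.random.default_rng()
    S, remaining = [], list(ground_set)
    for _ in range(k):
        # Marginal gain of adding each remaining element to the current set.
        gains = np.array([f(S + [v]) - f(S) for v in remaining])
        # Softmax smoothing: the hard argmax becomes a random choice.
        logits = gains / temperature
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        idx = rng.integers(0, len(remaining)) if probs.size == 1 else \
              rng.choice(len(remaining), p=probs)
        S.append(remaining.pop(idx))
    return S
```

Because the output is now a random set drawn from a parameter-dependent distribution, expectations of any output-dependent quantity can be estimated by Monte Carlo over repeated calls, and a score-function (REINFORCE-style) estimator over those samples gives unbiased gradients with respect to the objective's parameters, in the spirit of the sampling-based estimators the abstract describes.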
