On the use of hybrid coarse-level models in multilevel minimization methods

11/28/2022
by Alena Kopaničáková, et al.

Solving large-scale nonlinear minimization problems is computationally demanding. Nonlinear multilevel minimization (NMM) methods exploit the structure of the underlying minimization problem to solve such problems in a computationally efficient and scalable manner. The efficiency of NMM methods relies on the quality of the coarse-level models. Traditionally, coarse-level models are constructed using the additive approach, where the so-called τ-correction enforces local coherence between the fine-level and coarse-level objective functions. In this work, we extend this methodology and discuss how to enforce local coherence between the objective functions using a multiplicative approach. Moreover, we present a hybrid approach, which combines the advantages of both the additive and the multiplicative approaches. Using numerical experiments from the field of deep learning, we show that the hybrid approach can greatly improve the convergence speed of NMM methods, making it an attractive alternative to the almost universally used additive approach.
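To make the additive versus multiplicative distinction concrete, the following is a minimal sketch, not the paper's actual formulation: the additive (τ-corrected) coarse model adds a linear term so its gradient at the restricted iterate matches the restricted fine-level gradient (first-order coherence), while a multiplicative variant rescales the coarse objective so its value matches the fine-level objective at that point. The objective functions, the injection-style restriction operator, and the specific multiplicative scaling are all illustrative assumptions.

```python
import numpy as np

# Hypothetical fine-level objective and a cheaper coarse-level surrogate.
def f_fine(x):
    return np.sum(x**4 + x**2)

def grad_fine(x):
    return 4 * x**3 + 2 * x

def f_coarse(x):
    return np.sum(x**2)  # deliberately cruder model

def grad_coarse(x):
    return 2 * x

def restrict(x):
    return x[::2]  # simple injection restriction (assumption)

def additive_coarse_model(x_h):
    """Additive (tau-corrected) model: gradient at R x_h matches R grad_fine(x_h)."""
    x_H0 = restrict(x_h)
    tau = restrict(grad_fine(x_h)) - grad_coarse(x_H0)
    return lambda x_H: f_coarse(x_H) + tau @ (x_H - x_H0)

def multiplicative_coarse_model(x_h, eps=1e-12):
    """Multiplicative variant: rescale so the value at R x_h matches f_fine(x_h)."""
    x_H0 = restrict(x_h)
    scale = f_fine(x_h) / (f_coarse(x_H0) + eps)
    return lambda x_H: scale * f_coarse(x_H)
```

A hybrid scheme, in the spirit of the abstract, would blend these two corrections so the coarse model inherits both value and gradient coherence with the fine-level objective.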
