Deep adaptive basis Galerkin method for high-dimensional evolution equations with oscillatory solutions

12/29/2021
by Yiqi Gu, et al.

In this paper, we study deep neural networks (DNNs) for solving high-dimensional evolution equations with oscillatory solutions. Unlike deep least-squares methods, which treat the time and space variables simultaneously, we propose a deep adaptive basis Galerkin (DABG) method that employs a spectral-Galerkin method in the time variable, with a tensor-product basis suited to oscillatory solutions, and deep neural networks for the high-dimensional space variables. The proposed method leads to a linear system of differential equations whose unknowns are DNNs, which are trained by minimizing a loss function. We establish an a posteriori estimate of the solution error, bounded by the minimal value of the loss function plus a term O(N^-m), where N is the number of basis functions and m characterizes the regularity of the equation. We further show that if the true solution is a Barron-type function, the error bound converges to zero as M = O(N^p) approaches infinity, where M is the width of the networks used and p is a positive constant. Numerical examples, including high-dimensional linear parabolic and hyperbolic equations and the nonlinear Allen-Cahn equation, demonstrate that the proposed DABG method outperforms existing DNN-based solvers.
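To make the construction concrete, the following is a minimal sketch of a DABG-style ansatz. It is an illustrative assumption rather than the authors' implementation: it combines shifted Legendre polynomials in time with small fully connected networks in space, and trains with a simple collocation residual loss for a toy heat equation instead of the paper's spectral-Galerkin projection in time. All names, network sizes, and sampling choices below are hypothetical.

# Illustrative sketch (assumptions, not the paper's implementation):
# approximate u(x, t) ~= sum_k P_k(t) * N_k(x; theta_k), where P_k are shifted
# Legendre polynomials in time and each N_k is a small MLP in space. Training
# uses a collocation residual loss for a toy heat equation u_t = Laplacian(u) + f.
import torch
import torch.nn as nn

dim, N_basis, width = 4, 6, 64  # assumed spatial dimension, # of temporal modes, MLP width

class SpaceNet(nn.Module):
    """One spatial coefficient network N_k(x; theta_k)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x):
        return self.net(x)

nets = nn.ModuleList([SpaceNet() for _ in range(N_basis)])

def legendre(t, k):
    """Shifted Legendre polynomial P_k on [0, 1] via the three-term recurrence."""
    s = 2.0 * t - 1.0
    if k == 0:
        return torch.ones_like(s)
    P_prev, P = torch.ones_like(s), s
    for n in range(1, k):
        P_prev, P = P, ((2 * n + 1) * s * P - n * P_prev) / (n + 1)
    return P

def u_hat(x, t):
    """DABG-style ansatz: u(x, t) = sum_k P_k(t) * N_k(x)."""
    return sum(legendre(t, k) * nets[k](x) for k in range(N_basis))

def residual_loss(x, t):
    """Mean-squared residual of u_t - Laplacian(u) - f at collocation points."""
    x = x.clone().requires_grad_(True)
    t = t.clone().requires_grad_(True)
    u = u_hat(x, t)
    ones = torch.ones_like(u)
    u_t = torch.autograd.grad(u, t, ones, create_graph=True)[0]
    grad_x = torch.autograd.grad(u, x, ones, create_graph=True)[0]
    lap = 0.0
    for i in range(dim):  # sum of second derivatives d^2 u / d x_i^2
        lap = lap + torch.autograd.grad(
            grad_x[:, i:i + 1], x, ones, create_graph=True
        )[0][:, i:i + 1]
    f = torch.zeros_like(u)  # placeholder source term (assumption)
    return ((u_t - lap - f) ** 2).mean()

opt = torch.optim.Adam(nets.parameters(), lr=1e-3)
for step in range(2000):
    x = torch.rand(256, dim)  # random collocation points in [0, 1]^d
    t = torch.rand(256, 1)
    loss = residual_loss(x, t)  # plus initial/boundary-condition terms in practice
    opt.zero_grad()
    loss.backward()
    opt.step()

In this sketch the temporal basis is fixed and only the spatial networks are trained, which mirrors the separation of time and space variables described in the abstract; the paper's actual Galerkin treatment of the time variable and its choice of oscillatory basis may differ.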
