Deep neural networks for smooth approximation of physics with higher order and continuity B-spline base functions

01/03/2022
by Kamil Doległo, et al.

This paper addresses the following research question. Traditionally, a neural network approximates a given physical phenomenon by composing non-linear activation functions with linear operators: it "fills the space" with such compositions and adjusts their coefficients to fit the phenomenon. We claim that it is better to "fill the space" with linear combinations of smooth, higher-order B-spline basis functions, as employed in isogeometric analysis, and to use the neural network to adjust the coefficients of these linear combinations. In other words, we evaluate the use of neural networks for approximating the coefficients of B-spline basis functions against their use for approximating the solution directly. Solving differential equations with neural networks was proposed by Maziar Raissi et al. in 2017 with the introduction of Physics-Informed Neural Networks (PINNs), which naturally encode underlying physical laws as prior information. Approximating the coefficients with a function as the network input leverages the well-known capability of neural networks as universal function approximators. In essence, in the PINN approach the network approximates the value of the given field at a given point. We present an alternative approach, in which the physical quantity is approximated as a linear combination of smooth B-spline basis functions and the neural network approximates the coefficients of the B-splines. This research compares a DNN that approximates the coefficients of the linear combination of B-spline basis functions with a DNN that approximates the solution directly. We show that our approach is cheaper and more accurate when approximating smooth physical fields.
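To make the contrast concrete, below is a minimal, hypothetical 1D sketch (not the authors' code) of the representation the abstract advocates: a field u(x) is expressed as a linear combination u(x) = sum_i c_i B_i(x) of smooth B-spline basis functions, and the coefficients c_i are the quantities a DNN would predict, whereas a PINN would predict u(x) point-wise. The polynomial degree, knot vector, number of basis functions, and the placeholder coefficients are all assumptions made for illustration; the basis evaluation relies on scipy.interpolate.BSpline.

    import numpy as np
    from scipy.interpolate import BSpline

    # Illustration of the B-spline representation: u(x) = sum_i c_i * B_i(x),
    # where B_i are C^{p-1}-continuous basis functions of degree p and the
    # coefficients c_i would, in the paper's setting, be produced by a DNN.

    degree = 3        # cubic B-splines -> C^2 continuity (assumed for illustration)
    n_basis = 10      # number of basis functions (assumed)

    # Open (clamped) knot vector on [0, 1]
    knots = np.concatenate([np.zeros(degree),
                            np.linspace(0.0, 1.0, n_basis - degree + 1),
                            np.ones(degree)])

    def basis_matrix(x):
        """Evaluate all B-spline basis functions B_i at the points x."""
        B = np.empty((len(x), n_basis))
        for i in range(n_basis):
            c = np.zeros(n_basis)
            c[i] = 1.0                      # a single 1 picks out basis function B_i
            B[:, i] = BSpline(knots, c, degree)(x)
        return B

    # Placeholder coefficients; in the paper these come from the neural network.
    coeffs = np.sin(np.linspace(0.0, np.pi, n_basis))

    x = np.linspace(0.0, 1.0, 200)
    u = basis_matrix(x) @ coeffs            # smooth approximation u(x)
    print(u.shape)                          # (200,)

In this representation the smoothness and higher-order continuity of the approximated field come from the basis functions themselves, so the network only has to supply a small vector of coefficients rather than point-wise field values.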
