Learning Time Delay Systems with Neural Ordinary Differential Equations

06/28/2022
by Xunbi A. Ji, et al.

A novel way of using neural networks to learn the dynamics of time delay systems from sequential data is proposed. A neural network with trainable delays is used to approximate the right-hand side of a delay differential equation (DDE). We relate the DDE to an ordinary differential equation by discretizing the time history and train the corresponding neural ordinary differential equation (NODE) to learn the dynamics. As an example, we learn the dynamics of the Mackey-Glass equation from data generated in its chaotic regime. After learning both the nonlinearity and the time delay, we demonstrate that the bifurcation diagram of the neural network matches that of the original system.
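The abstract does not give implementation details, but the core idea can be sketched as follows: discretize the recent history x(t - i Δt), i = 0, …, N, into a buffer so the DDE becomes a finite-dimensional ODE, let a small network approximate the right-hand side from the current state and a delayed state selected from that buffer, and make the delay itself a trainable parameter. The sketch below is a minimal, hypothetical illustration of this idea, not the authors' code; the Mackey-Glass parameters, network sizes, step size, and the Euler integrator are illustrative assumptions.

```python
# Hypothetical sketch: a neural DDE right-hand side with a trainable delay,
# evaluated on a discretized history buffer so it can be stepped like an ODE.
import torch
import torch.nn as nn

dt, n_hist = 0.1, 200            # assumed step size and history length

def mackey_glass_rhs(x, x_delay, beta=2.0, gamma=1.0, n=9.65):
    # Classic Mackey-Glass right-hand side, usable to generate training data.
    return beta * x_delay / (1.0 + x_delay**n) - gamma * x

class NeuralDDE(nn.Module):
    """Approximates dx/dt = f_theta(x(t), x(t - tau)) with trainable tau."""
    def __init__(self):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))
        self.tau = nn.Parameter(torch.tensor(0.5))   # initial guess for the delay

    def delayed_state(self, history):
        # history: (batch, n_hist) buffer of past states, newest first.
        # Select x(t - tau) by linear interpolation over the discretized history;
        # the gradient with respect to tau flows through the interpolation weight.
        idx = torch.clamp(self.tau / dt, 0.0, n_hist - 2.0)
        lo = int(idx.detach().floor())
        frac = idx - lo
        return (1 - frac) * history[:, lo:lo + 1] + frac * history[:, lo + 1:lo + 2]

    def forward(self, history):
        x_now = history[:, :1]                        # current state x(t)
        x_del = self.delayed_state(history)           # delayed state x(t - tau)
        return self.f(torch.cat([x_now, x_del], dim=1))

def euler_step(model, history):
    # One explicit Euler step of the discretized-history ODE:
    # compute the new state, prepend it, and drop the oldest entry.
    x_new = history[:, :1] + dt * model(history)
    return torch.cat([x_new, history[:, :-1]], dim=1)
```

Under these assumptions, training would consist of rolling out short segments with `euler_step` and minimizing the mismatch to observed trajectories, so that the network weights and the delay tau are learned jointly, consistent with the abstract's statement that both the nonlinearity and the time delay are learned.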
