Dimension reduction in recurrent networks by canonicalization

07/23/2020
by Lyudmila Grigoryeva, et al.

Many recurrent neural network machine learning paradigms can be formulated using state-space representations. The classical notion of canonical state-space realization is adapted in this paper to accommodate semi-infinite inputs so that it can be used as a dimension reduction tool in the recurrent network setup. The so-called input forgetting property is identified as the key hypothesis that guarantees the existence and uniqueness (up to system isomorphisms) of canonical realizations for causal and time-invariant input/output systems with semi-infinite inputs. A second result uses the notion of optimal reduction, borrowed from the theory of symmetric Hamiltonian systems, to construct canonical realizations out of input forgetting but not necessarily canonical ones. These two procedures are implemented and studied in detail in the framework of linear fading memory input/output systems.
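For orientation, here is a hedged sketch of the objects involved; the notation follows common conventions in this literature and is not taken verbatim from the paper. A state-space system driven by a semi-infinite input $z = (\dots, z_{-1}, z_0)$ takes the form

$$x_t = F(x_{t-1}, z_t), \qquad y_t = h(x_t), \qquad t \in \mathbb{Z}_-,$$

and induces a causal, time-invariant input/output map $z \mapsto U(z) := y_0$ whenever the state equation admits a unique solution. The input forgetting property then roughly requires that the influence of the remote past vanishes: for any inputs $w, w'$ (two candidate remote pasts) and any $z$,

$$\lim_{T \to \infty} \left\| U\big(w \, z_{-T:0}\big) - U\big(w' \, z_{-T:0}\big) \right\| = 0,$$

where $w \, z_{-T:0}$ denotes the semi-infinite sequence $w$ followed by the last $T+1$ entries of $z$. In the linear case treated in the paper, the system reads $x_t = A x_{t-1} + B z_t$, $y_t = C x_t$; if the spectral radius satisfies $\rho(A) < 1$ (a standard sufficient condition for fading memory with bounded inputs), the unique solution is

$$x_t = \sum_{j \ge 0} A^j B \, z_{t-j},$$

so the contribution of inputs $T$ steps in the past decays like $\|A^T\|$, which illustrates why such systems forget their inputs.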
