The Space of Functions Computed By Deep Layered Machines
We study the space of Boolean functions computed by random layered machines, including deep neural networks and Boolean circuits. Investigating recurrent and layered feed-forward architectures, we find that the spaces of functions realized by the two architectures coincide. We show that, depending on the initial conditions and computing elements used, the entropy of the Boolean functions computed by deep layered machines either increases or decreases monotonically with depth, and we characterize the space of functions computed in the large-depth limit.
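A minimal numerical sketch of the kind of quantity discussed here: sample random layered feed-forward networks of threshold (sign) gates, read off the Boolean function each network computes on all 2^n inputs, and estimate the empirical entropy of the resulting distribution over functions as depth grows. The gate type, layer widths, and sample sizes below are illustrative assumptions, not the paper's actual model or parameters.

```python
# Toy illustration (assumed setup, not the paper's model): estimate the
# entropy of the distribution of Boolean functions computed by random
# layered sign-gate networks, as a function of depth.
import itertools
from collections import Counter

import numpy as np

N_INPUTS = 3          # number of Boolean input variables (assumed)
WIDTH = 8             # units per hidden layer (assumed)
N_SAMPLES = 20000     # networks sampled per depth (assumed)
rng = np.random.default_rng(0)

# All 2^n input patterns as rows of +/-1 entries.
inputs = np.array(list(itertools.product([-1.0, 1.0], repeat=N_INPUTS)))

def random_layered_function(depth: int) -> bytes:
    """Truth table computed by one random depth-`depth` network of sign gates."""
    x = inputs
    for _ in range(depth):
        w = rng.standard_normal((x.shape[1], WIDTH))
        x = np.sign(x @ w)        # each unit: sign of a Gaussian-weighted sum
        x[x == 0] = 1.0           # break ties deterministically
    w_out = rng.standard_normal(x.shape[1])
    out = np.sign(x @ w_out)      # single readout unit
    out[out == 0] = 1.0
    return (out > 0).astype(np.uint8).tobytes()   # hashable truth table

def entropy_of_functions(depth: int) -> float:
    """Empirical entropy (in bits) of the distribution over computed functions."""
    counts = Counter(random_layered_function(depth) for _ in range(N_SAMPLES))
    p = np.array(list(counts.values()), dtype=float) / N_SAMPLES
    return float(-(p * np.log2(p)).sum())

if __name__ == "__main__":
    for depth in (1, 2, 4, 8, 16):
        print(f"depth {depth:2d}: empirical entropy ~ {entropy_of_functions(depth):.3f} bits")
```

Note that the empirical entropy is bounded above by log2(N_SAMPLES), so the sample size must be large relative to the number of distinct functions actually realized for the estimate to be meaningful.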