Evaluating complexity and resilience trade-offs in emerging memory inference machines

02/25/2020
by Christopher H. Bennett, et al.

Neuromorphic-style inference only works well if limited hardware resources are used effectively, i.e., accuracy continues to scale with parameters and complexity even in the presence of disturbances. In this work, we use realistic crossbar simulations to show that compact implementations of deep neural networks are unexpectedly susceptible to collapse under multiple system disturbances. Our work proposes a middle path towards high performance and strong resilience using the Mosaics framework, specifically by re-using synaptic connections in a recurrent neural network implementation that possesses a natural form of noise immunity.
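The abstract does not spell out the disturbance models used in the crossbar simulations, but a rough, hypothetical sketch of the kind of experiment it describes (mapping network weights onto crossbars, injecting device-level disturbances, and re-evaluating the network) might look like the following. The layer sizes, noise standard deviation, and stuck-device fraction below are assumed purely for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the authors' simulator): inject two simple
# crossbar disturbance models, Gaussian conductance noise and stuck-at-zero
# device faults, into the weight matrices of a small feed-forward network,
# then measure how far the disturbed outputs drift from the clean ones.
# All layer sizes, noise levels, and fault rates are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def disturb(weights, noise_std=0.05, stuck_fraction=0.02):
    """Return a disturbed copy of a weight matrix mapped onto a crossbar."""
    scale = noise_std * np.abs(weights).max()
    w = weights + rng.normal(0.0, scale, weights.shape)
    stuck = rng.random(weights.shape) < stuck_fraction  # devices stuck at zero conductance
    w[stuck] = 0.0
    return w

def forward(x, layers):
    """Simple ReLU multilayer perceptron forward pass."""
    for w in layers[:-1]:
        x = np.maximum(x @ w, 0.0)
    return x @ layers[-1]

# A hypothetical 3-layer network and a batch of random inputs.
layers = [rng.normal(0, 0.1, (64, 128)),
          rng.normal(0, 0.1, (128, 128)),
          rng.normal(0, 0.1, (128, 10))]
x = rng.normal(0, 1, (256, 64))

clean = forward(x, layers)
noisy = forward(x, [disturb(w) for w in layers])

# Relative output deviation as a crude proxy for accuracy loss under disturbance.
print("relative output deviation:", np.linalg.norm(noisy - clean) / np.linalg.norm(clean))
```

In a real evaluation this deviation would be replaced by task accuracy on a benchmark dataset, and the disturbance would be applied to the conductance pairs representing signed weights rather than to the weights directly.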
