MATIC: Adaptation and In-situ Canaries for Energy-Efficient Neural Network Acceleration

06/14/2017
by Sung Kim, et al.

- The primary author has withdrawn this paper due to a conflict of interest -

We present MATIC (Memory-Adaptive Training and In-situ Canaries), a voltage-scaling methodology that addresses the SRAM efficiency bottleneck in DNN accelerators. To overscale DNN weight SRAMs, MATIC combines the specific characteristics of destructive SRAM reads with the error resilience of neural networks in a memory-adaptive training process. PVT-related voltage margins are eliminated by using bit-cells from the synaptic weights as in-situ canaries that track runtime environmental variation. Demonstrated on a low-power DNN accelerator fabricated in 65 nm CMOS, MATIC enables up to 3.3x total energy reduction, or an 18.6x reduction in application error.
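The memory-adaptive training idea can be illustrated with a simple fault model: weights stored in an overscaled SRAM experience random bit flips, and training against such faults exposes the network to the errors it will see at run time. The sketch below is only a minimal illustration of that injection step, assuming a uniform per-bit flip probability and signed 8-bit fixed-point weights; the paper's actual fault model, accelerator, and training flow are not reproduced here, and `inject_bit_errors` is a hypothetical helper name.

```python
import numpy as np

rng = np.random.default_rng(0)

def inject_bit_errors(weights, p_flip, n_bits=8):
    """Simulate an overscaled weight SRAM by flipping each stored bit
    with probability p_flip (illustrative uniform fault model, not the
    paper's measured SRAM failure characteristics)."""
    # Quantize weights to signed n-bit fixed point.
    scale = (2 ** (n_bits - 1)) - 1
    q = np.clip(np.round(weights * scale), -scale, scale).astype(np.int32)
    # Work on the two's-complement bit pattern of each weight.
    u = q & ((1 << n_bits) - 1)
    flips = rng.random((*u.shape, n_bits)) < p_flip
    mask = (flips * (1 << np.arange(n_bits))).sum(axis=-1).astype(np.int32)
    u ^= mask
    # Convert back to signed values and dequantize.
    q_err = np.where(u >= (1 << (n_bits - 1)), u - (1 << n_bits), u)
    return q_err / scale

w = rng.normal(0.0, 0.1, size=(4, 4))
w_faulty = inject_bit_errors(w, p_flip=0.01)
```

In a memory-adaptive training loop, a step like this would corrupt the weights in the forward pass so the network learns parameters that remain accurate under the bit-error distribution of the overscaled memory.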
