CRAFT: Criticality-Aware Fault-Tolerance Enhancement Techniques for Emerging Memories-Based Deep Neural Networks

02/08/2023
by Thai-Hoang Nguyen, et al.

Deep Neural Networks (DNNs) have emerged as the most effective programming paradigm for computer vision and natural language processing applications. With the rapid development of DNNs, efficient hardware architectures for deploying DNN-based applications on edge devices have been extensively studied. Emerging Non-Volatile Memories (NVMs), with their better scalability, non-volatility, and good read performance, are promising candidates for deploying DNNs. Despite this promise, however, emerging NVMs often suffer from reliability issues such as stuck-at faults, which decrease chip yield and memory lifetime and severely degrade DNN accuracy. A stuck-at cell can be read but not reprogrammed; consequently, a stuck-at fault may or may not result in an error, depending on whether the data to be stored matches the value the cell is stuck at. By reducing the number of errors caused by stuck-at faults, the reliability of a DNN-based system can be enhanced. This paper proposes CRAFT, i.e., Criticality-Aware Fault-Tolerance Enhancement Techniques, to enhance the reliability of NVM-based DNNs in the presence of stuck-at faults. A data block remapping technique is used to reduce the impact of stuck-at faults on DNN accuracy. Additionally, a bit-level criticality analysis of various DNNs identifies the critical bit positions in the network parameters, i.e., those whose corruption significantly degrades accuracy. Based on this analysis, we propose an encoding method that swaps the critical bit positions with those of non-critical bits whenever more errors (due to stuck-at faults) would otherwise fall on the critical bits.
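To make the fault model and the swap-based encoding concrete, the sketch below illustrates the idea in Python. It is a minimal illustration under assumed conventions, not the paper's exact algorithm: per-word stuck-at masks, equal numbers of critical and non-critical bit positions, and a single flag recording the chosen layout so a decoder can undo the swap on read-back. The function names and data layout are ours.

```python
import numpy as np

def critical_errors(data_bits, stuck_mask, stuck_values, crit_pos, phys_pos):
    """Count stuck-at-induced errors hitting the critical data bits when the
    data bits at crit_pos are stored in the physical cells at phys_pos.
    A stuck-at cell causes an error only if the bit to be written differs
    from the value the cell is stuck at."""
    faulty = stuck_mask[phys_pos]
    mismatch = data_bits[crit_pos] != stuck_values[phys_pos]
    return int(np.sum(faulty & mismatch))

def swap_encode(data_bits, stuck_mask, stuck_values, crit_pos, noncrit_pos):
    """Choose between the direct layout and a layout in which the critical
    bit positions are exchanged with non-critical ones, keeping whichever
    leaves fewer errors on the critical bits. Returns the bits to program
    and a flag that a decoder would use to undo the swap."""
    errs_direct = critical_errors(data_bits, stuck_mask, stuck_values,
                                  crit_pos, crit_pos)
    errs_swapped = critical_errors(data_bits, stuck_mask, stuck_values,
                                   crit_pos, noncrit_pos)
    stored = data_bits.copy()
    use_swap = errs_swapped < errs_direct
    if use_swap:
        stored[crit_pos], stored[noncrit_pos] = (
            data_bits[noncrit_pos].copy(), data_bits[crit_pos].copy())
    return stored, use_swap

# Example: an 8-bit word where the four high-order bits are critical.
# Cell 0 is stuck at 0 and cell 7 is stuck at 1.
data = np.array([1, 0, 1, 1, 0, 1, 0, 0], dtype=np.uint8)          # bit 7 first
stuck_mask = np.array([1, 0, 0, 0, 0, 0, 0, 1], dtype=bool)        # faulty cells
stuck_values = np.array([0, 0, 0, 0, 0, 0, 0, 1], dtype=np.uint8)  # stuck-at values
crit_pos = np.array([0, 1, 2, 3])      # indices of critical bit positions
noncrit_pos = np.array([4, 5, 6, 7])   # indices of non-critical positions
stored, swapped = swap_encode(data, stuck_mask, stuck_values,
                              crit_pos, noncrit_pos)
# Here the direct layout puts one error on a critical bit, while the swapped
# layout puts none, so swapped is True and the word is programmed swapped.
```

In a real design, the swap flag would be stored alongside each data block and consulted on read-back; the paper's data block remapping operates at a coarser granularity and is not shown here.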
