Binary Neural Networks as a general-purpose compute paradigm for on-device computer vision

02/08/2022
by   Guhong Nie, et al.

For binary neural networks (BNNs) to become the mainstream on-device computer vision algorithm, they must achieve a better speed-vs-accuracy tradeoff than 8-bit quantization and establish a similar degree of general applicability in vision tasks. To this end, we propose a BNN framework comprising 1) a minimalistic inference scheme for hardware-friendliness, 2) an over-parameterized training scheme for high accuracy, and 3) a simple procedure to adapt to different vision tasks. The resultant framework overtakes 8-bit quantization in the speed-vs-accuracy tradeoff for classification, detection, segmentation, super-resolution and matching: our BNNs not only retain the accuracy levels of their 8-bit baselines but also showcase 1.3-2.4× faster FPS on mobile CPUs. Similar conclusions can be drawn for prototypical systolic-array-based AI accelerators, where our BNNs promise 2.8-7× fewer execution cycles than 8-bit and 2.1-2.7× fewer cycles than alternative BNN designs. These results suggest that the time for large-scale BNN adoption could be upon us.
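The speed advantage of BNNs comes from replacing multiply-accumulate operations with bitwise XNOR and popcount on ±1-valued weights and activations. The sketch below is not the paper's inference scheme; it only illustrates, under assumed names and a NumPy bit-packed representation, the standard BNN primitive of computing a ±1 dot product with XNOR-popcount and checking it against the floating-point equivalent.

```python
# Illustrative sketch of the standard BNN inference primitive (not the
# paper's implementation): a +/-1 dot product via XNOR and popcount on
# bit-packed operands. All names here are hypothetical.
import numpy as np

def binarize(x):
    """Map real values to {+1, -1} with the sign function (0 maps to +1)."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def pack_bits(b):
    """Encode +1 as bit 1 and -1 as bit 0, packed into uint8 words."""
    return np.packbits(b == 1)

def xnor_popcount_dot(pa, pb, n):
    """Dot product of two +/-1 vectors of length n from their packed bits.

    XNOR marks positions where the bits agree (elementwise products of +1);
    the +/-1 dot product is then matches - mismatches = 2*matches - n.
    """
    xnor = ~(pa ^ pb)                          # 1 wherever the bits agree
    matches = int(np.unpackbits(xnor)[:n].sum())
    return 2 * matches - n

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a, w = rng.standard_normal(64), rng.standard_normal(64)
    ba, bw = binarize(a), binarize(w)
    ref = int(ba.astype(np.int32) @ bw.astype(np.int32))
    assert xnor_popcount_dot(pack_bits(ba), pack_bits(bw), 64) == ref
    print("XNOR-popcount matches the +/-1 dot product")
```

On hardware, the same idea maps to wide XNOR gates followed by a popcount tree, which is why binary layers can run several times faster than 8-bit ones on both mobile CPUs and systolic-array accelerators.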
