LEP-CNN: A Lightweight Edge Device Assisted Privacy-preserving CNN Inference Solution for IoT

01/14/2019
by Yifan Tian, et al.

Supporting convolutional neural network (CNN) inference on resource-constrained IoT devices in a timely manner has been an outstanding challenge for emerging smart systems. To mitigate the burden on IoT devices, the prevailing solution is to offload the CNN inference task, which usually comprises billions of operations, to the public cloud. However, this "offloading-to-cloud" solution may cause privacy breaches when sensitive data are moved to the cloud. For privacy protection, the research community has resorted to advanced cryptographic primitives and approximation techniques to support CNN inference on encrypted data. Unfortunately, these attempts incur impractical computational overhead on IoT devices and degrade the performance of CNNs. Moreover, relying on the remote cloud introduces additional network latency and can even render the system dysfunctional when the network connection is down. We propose LEP-CNN, an extremely lightweight edge-device-assisted private CNN inference solution for IoT devices. The main design of LEP-CNN is based on a novel online/offline encryption scheme: the decryption is pre-computed offline by exploiting the linearity of the most time-consuming CNN operations. As a result, LEP-CNN allows IoT devices to securely offload over 99% of CNN operations to edge devices and to perform inference on encrypted data as efficiently as on plaintext. LEP-CNN also provides an integrity check option that helps IoT devices detect erroneous results with a success rate of over 99%, and it accelerates CNN inference by more than 35 times for resource-constrained IoT devices. A homomorphic-encryption-based AlexNet built with CryptoNets is implemented for comparison, demonstrating that LEP-CNN outperforms homomorphic-encryption-based privacy-preserving neural networks in time-sensitive scenarios.
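The key enabler of the online/offline scheme is that convolution is linear in its input. The following Python sketch is my own illustration of that idea under assumed details (additive random masking, a single 3x3 kernel, and SciPy's correlate2d as a stand-in for a convolution layer), not the paper's actual protocol: the device masks its input offline, the edge computes on the masked data, and the device recovers the true result with one subtraction.

# Illustrative sketch only; the mask, kernel, and sizes are assumptions,
# not LEP-CNN's concrete construction.
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(0)

# Public convolution kernel held by the edge device (hypothetical 3x3 filter).
kernel = rng.standard_normal((3, 3))

def conv(x):
    # Linear operation that is offloaded to the edge device.
    return correlate2d(x, kernel, mode="valid")

# ---- Offline phase (IoT device, ahead of time) ----
# Pick a random mask r and pre-compute conv(r); this is the "pre-computed
# decryption" made possible by the linearity of convolution.
r = rng.standard_normal((8, 8))
conv_r = conv(r)

# ---- Online phase ----
x = rng.standard_normal((8, 8))    # sensitive input on the IoT device
ciphertext = x + r                 # lightweight additive encryption
edge_result = conv(ciphertext)     # edge computes on encrypted data only

# conv(x + r) = conv(x) + conv(r), so the device recovers the true output
# with a single subtraction instead of running the convolution itself.
recovered = edge_result - conv_r
assert np.allclose(recovered, conv(x))

Because the online cost on the device is just an addition before offloading and a subtraction afterward, the expensive convolution work lands entirely on the edge device, which is consistent with the lightweight design the abstract describes.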
