CryptoGRU: Low Latency Privacy-Preserving Text Analysis With GRU

10/22/2020
by Bo Feng, et al.

Billions of text analysis requests containing private emails, personal text messages, and sensitive online reviews are processed by recurrent neural networks (RNNs) deployed on public clouds every day. Although prior secure networks combine homomorphic encryption (HE) and garbled circuits (GC) to preserve users' privacy, naively adopting the HE and GC hybrid technique to implement RNNs suffers from long inference latency due to slow activation functions. In this paper, we present an HE and GC hybrid gated recurrent unit (GRU) network, CryptoGRU, for low-latency secure inference. CryptoGRU replaces the computationally expensive GC-based tanh with a fast GC-based ReLU, and then quantizes the sigmoid and ReLU activations with a smaller bit length to accelerate activations in a GRU. We evaluate CryptoGRU with multiple GRU models trained on 4 public datasets. Experimental results show CryptoGRU achieves top-notch accuracy and improves secure inference latency by up to 138× over one of the state-of-the-art secure networks on the Penn Treebank dataset.
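To make the core idea concrete, below is a minimal plaintext sketch of the modified GRU cell the abstract describes: the candidate-state tanh is swapped for ReLU, and both sigmoid and ReLU outputs are snapped to a small fixed-point grid. The function and weight names, the 6-bit width, and the ReLU clipping range are illustrative assumptions, not the paper's protocol; the actual HE (linear layers) and GC (activations) machinery is elided.

```python
import numpy as np

def quantize(x, bits, lo, hi):
    # Uniform fixed-point quantizer: clip to [lo, hi], snap to 2^bits levels.
    levels = 2 ** bits - 1
    x = np.clip(x, lo, hi)
    return np.round((x - lo) / (hi - lo) * levels) / levels * (hi - lo) + lo

def crypto_gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh, bits=6, relu_clip=4.0):
    """One GRU step with the candidate activation swapped from tanh to ReLU
    and both activations quantized to a small bit length.

    Plaintext sketch only: in the secure protocol the matrix products run
    under HE and the activations under GC. Weight names, the bit width,
    and the ReLU clipping range are illustrative assumptions.
    """
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    z = quantize(sigmoid(Wz @ x + Uz @ h), bits, 0.0, 1.0)   # update gate
    r = quantize(sigmoid(Wr @ x + Ur @ h), bits, 0.0, 1.0)   # reset gate
    cand = np.maximum(Wh @ x + Uh @ (r * h), 0.0)            # ReLU in place of tanh
    h_tilde = quantize(cand, bits, 0.0, relu_clip)           # quantized candidate state
    return (1.0 - z) * h + z * h_tilde
```

The design intuition is that ReLU is a single comparison inside a garbled circuit, whereas tanh needs a much deeper circuit, and shrinking the activation bit width shrinks the GC size roughly in proportion, which is where the latency savings would come from.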
