Real-Time Optimized N-gram For Mobile Devices

01/07/2021
by Sharmila Mani, et al.

With the increasing number of mobile devices, there has been continuous research on generating optimized Language Models (LMs) for soft keyboards. In spite of advances in this domain, building a single LM for low-end feature phones as well as high-end smartphones is still a pressing need. Hence, we propose a novel technique, Optimized N-gram (Op-Ngram), an end-to-end N-gram pipeline that utilises mobile resources efficiently for faster Word Completion (WC) and Next Word Prediction (NWP). Op-Ngram applies Stupid Backoff and pruning strategies to generate a light-weight model. The LM loading time on mobile is linear with respect to model size. We observed that Op-Ngram gives 37% improvement in LM-ROM size, 76% in LM-RAM size, 88% in loading time and 89% in average suggestion time as compared to the SORTED array variant of BerkeleyLM. Moreover, our method shows significant performance improvement over KenLM as well.
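For context, the Stupid Backoff scheme referenced above (Brants et al., 2007) scores an N-gram by relative frequency and, when the N-gram is unseen, backs off to the shorter context with a fixed penalty. The sketch below is a minimal illustration of that standard scoring rule, not the Op-Ngram implementation; the count-table layout, the function names, and the back-off factor of 0.4 are assumptions made for the example.

ALPHA = 0.4  # conventional back-off penalty from Brants et al. (2007); assumed here

def stupid_backoff_score(ngram, counts, total_tokens, alpha=ALPHA):
    """Score a word given its context with Stupid Backoff.

    `ngram` is a tuple such as ("how", "are", "you"); `counts` maps
    n-gram tuples of every order to their corpus counts. The result is
    a relative-frequency score, not a normalized probability.
    """
    if len(ngram) == 1:
        # Base case: unigram relative frequency.
        return counts.get(ngram, 0) / total_tokens
    context = ngram[:-1]
    if counts.get(ngram, 0) > 0 and counts.get(context, 0) > 0:
        return counts[ngram] / counts[context]
    # Unseen n-gram: drop the oldest context word and apply the penalty.
    return alpha * stupid_backoff_score(ngram[1:], counts, total_tokens, alpha)

def rank_next_words(context, vocab, counts, total_tokens, top_k=3):
    """Toy next-word prediction: rank vocabulary words by back-off score."""
    scored = ((w, stupid_backoff_score(context + (w,), counts, total_tokens)) for w in vocab)
    return sorted(scored, key=lambda item: item[1], reverse=True)[:top_k]

With trigram counts from a small corpus, rank_next_words(("how", "are"), vocab, counts, total_tokens) would return the highest-scoring next-word candidates; word completion would simply restrict vocab to words sharing the typed prefix.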
