Making Neural Machine Reading Comprehension Faster

03/29/2019
by   Debajyoti Chatterjee, et al.

This study addresses the Machine Reading Comprehension problem, in which questions must be answered given a context passage. The goal is to develop a computationally faster model with improved inference time. BERT, the state of the art in many natural language understanding tasks, is used as the teacher, and knowledge distillation is applied to train two smaller student models. The resulting models are compared with other models developed with the same aim.
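The abstract does not give the exact training objective, but knowledge distillation is commonly implemented as a weighted sum of a soft loss (KL divergence between temperature-softened teacher and student distributions) and a hard cross-entropy loss on the ground-truth labels. Below is a minimal NumPy sketch of that standard formulation (after Hinton et al.); the function names, the temperature value, and the mixing weight `alpha` are illustrative assumptions, not details from the paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Softened probabilities; a higher temperature flattens the distribution
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft loss: KL divergence between softened teacher and student outputs
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    soft = np.sum(t * (np.log(t + 1e-12) - np.log(s + 1e-12)), axis=-1).mean()
    # Hard loss: cross-entropy of the student against ground-truth labels
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()
    # T^2 compensates for the gradient scaling introduced by the temperature
    return alpha * (temperature ** 2) * soft + (1 - alpha) * hard
```

When student and teacher logits are identical, the soft term vanishes and only the hard cross-entropy remains, which is a quick sanity check on the implementation.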
