DiaNet: BERT and Hierarchical Attention Multi-Task Learning of Fine-Grained Dialect

10/31/2019
by Muhammad Abdul-Mageed, et al.

Prediction of language varieties and dialects is an important language processing task, with a wide range of applications. For Arabic, the native tongue of ~300 million people, most varieties remain unsupported. To ease this bottleneck, we present a very large-scale dataset covering 319 cities from all 21 Arab countries. We introduce a hierarchical attention multi-task learning (HA-MTL) approach for dialect identification exploiting our data at the city, state, and country levels. We also evaluate the use of BERT on the three tasks, comparing it to the HA-MTL approach. We benchmark and release our data and models.
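To make the multi-task setup concrete, below is a minimal sketch of what a shared attention encoder with one classification head per granularity level could look like. Only the city and country label counts (319 and 21) come from the abstract; the state count, layer sizes, and the additive-attention formulation are all illustrative assumptions, not the paper's actual architecture.

```python
# Minimal sketch of a hierarchical attention multi-task classifier.
# Layer sizes, the state-label count, and the attention formulation are
# illustrative assumptions; they are not taken from the paper.
import torch
import torch.nn as nn


class AttentionPool(nn.Module):
    """Simple additive attention pooling over token states."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, states):  # states: (batch, seq, hidden)
        weights = torch.softmax(self.score(states), dim=1)
        return (weights * states).sum(dim=1)  # (batch, hidden)


class HAMTLDialectModel(nn.Module):
    """Shared BiGRU encoder with attention, plus one head per task level."""

    def __init__(self, vocab_size, n_cities=319, n_states=100, n_countries=21,
                 emb_dim=300, hidden_dim=256):
        # n_states=100 is a placeholder; the abstract gives no state count.
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.attn = AttentionPool(2 * hidden_dim)
        # One classification head per granularity (city, state, country).
        self.city_head = nn.Linear(2 * hidden_dim, n_cities)
        self.state_head = nn.Linear(2 * hidden_dim, n_states)
        self.country_head = nn.Linear(2 * hidden_dim, n_countries)

    def forward(self, token_ids):
        states, _ = self.encoder(self.embed(token_ids))
        pooled = self.attn(states)
        return (self.city_head(pooled),
                self.state_head(pooled),
                self.country_head(pooled))
```

In a setup like this, training would typically sum a cross-entropy loss over the three heads so the shared encoder learns from city, state, and country supervision jointly; how the paper actually weights or schedules the tasks is not specified in this abstract.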
