BERT-Based Arabic Social Media Author Profiling

09/09/2019
by Chiyu Zhang, et al.

We report our models for detecting age, language variety, and gender from social media data in the context of the Arabic author profiling and deception detection shared task (APDA). We build simple models based on pre-trained Bidirectional Encoder Representations from Transformers (BERT). We first fine-tune the pre-trained BERT model on each of the three tasks using the shared-task data. We then augment the shared-task data with in-house data for gender and dialect, showing the utility of augmenting the training data. Our best models on the shared-task test data are acquired by majority voting over various BERT models trained under different data conditions. We acquire 54.72 accuracy for age, 93.75 … the three tasks.
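The majority-voting step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the model names and dialect labels below are hypothetical, and each "model" is represented only by its list of per-example predicted labels.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model label predictions by majority vote.

    predictions: a list with one inner list per model, each inner
    list holding one predicted label per test example.
    Returns the most common label per example (ties broken by the
    order votes are first seen, via Counter.most_common).
    """
    n_examples = len(predictions[0])
    voted = []
    for i in range(n_examples):
        votes = Counter(model_preds[i] for model_preds in predictions)
        voted.append(votes.most_common(1)[0][0])
    return voted

# Hypothetical dialect predictions from three fine-tuned BERT variants
# over four test tweets (labels are illustrative placeholders).
model_a = ["Egypt", "Gulf", "Levant", "Egypt"]
model_b = ["Egypt", "Egypt", "Levant", "Gulf"]
model_c = ["Gulf", "Gulf", "Levant", "Egypt"]
print(majority_vote([model_a, model_b, model_c]))
# -> ['Egypt', 'Gulf', 'Levant', 'Egypt']
```

The same routine applies unchanged to the age and gender tasks, since it only counts label strings per example.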

