The COVID-19 Infodemic: Can the Crowd Judge Recent Misinformation Objectively?

by Kevin Roitero et al.

Misinformation is an ever-increasing problem that is difficult for the research community to solve and has a negative impact on society at large. Very recently, the problem has been addressed with a crowdsourcing-based approach to scale up labeling efforts: to assess the truthfulness of a statement, instead of relying on a few experts, a crowd of (non-expert) judges is exploited. We follow the same approach to study whether crowdsourcing is an effective and reliable method to assess statement truthfulness during a pandemic. We specifically target statements related to the COVID-19 health emergency, which is still ongoing at the time of the study and has arguably caused an increase in the amount of misinformation spreading online (a phenomenon for which the term "infodemic" has been coined). By doing so, we are able to address (mis)information that is both related to a sensitive and personal issue like health and very recent compared to the time at which the judgment is made: two issues that have not been analyzed in related work.

In our experiment, crowd workers are asked to assess the truthfulness of statements and to provide evidence for their assessments in the form of a URL and a textual justification. Besides showing that the crowd is able to accurately judge the truthfulness of the statements, we also report results on many different aspects, including agreement among workers and the effects of different aggregation functions, of scale transformations, and of workers' background and bias. We also analyze worker behavior in terms of queries submitted, URLs found and selected, textual justifications, and other behavioral data such as clicks and mouse actions collected by means of an ad hoc logger.
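The aggregation step mentioned above can be illustrated with a minimal sketch: several workers judge each statement on a discrete truthfulness scale, and an aggregation function collapses those judgments into a single label. The statement identifiers, the 6-point scale, and the choice of mean and median are illustrative assumptions, not details taken from the paper.

```python
from statistics import mean, median

# Hypothetical data: each statement receives truthfulness judgments
# from several crowd workers on a 6-point scale (0 = false, 5 = true).
judgments = {
    "statement_1": [4, 5, 4, 3, 5],
    "statement_2": [1, 0, 2, 1, 1],
}

def aggregate(scores, fn=mean):
    """Collapse individual worker scores into a single label via fn."""
    return fn(scores)

for stmt, scores in judgments.items():
    # Different aggregation functions can yield different labels,
    # which is one of the effects the study examines.
    print(stmt, aggregate(scores, mean), aggregate(scores, median))
```

Comparing the mean against a rank-based aggregator such as the median is a simple way to see how robust the crowd label is to individual outlier judgments.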


