Robots Racialized in the Likeness of Marginalized Social Identities are Subject to Greater Dehumanization than those racialized as White

08/01/2018
by Megan Strait, et al.

The emergence and spread of humanlike robots into increasingly public domains has revealed a concerning phenomenon: people's unabashed dehumanization of robots, particularly those gendered as female. Here we examined this phenomenon further, to understand whether other socially marginalized cues (racialization in the likeness of Asian and Black identities), like female-gendering, are associated with the manifestation of dehumanization (e.g., objectification, stereotyping) in human-robot interactions. To that end, we analyzed free-form comments (N=535) on three videos, each depicting a gynoid (Bina48, Nadine, or Yangyang) racialized as Black, White, and Asian, respectively. As a preliminary control, we additionally analyzed commentary (N=674) on three videos depicting women embodying similar identity cues. The analyses indicate that people more frequently dehumanize robots racialized as Asian and Black than they do robots racialized as White. An additional, preliminary evaluation of how people's responses to the gynoids compare with their responses to other people suggests that the gynoids' ontology (as robots) further facilitates the dehumanization.
