It's your turn! – A collaborative human-robot pick-and-place scenario in a virtual industrial setting

05/28/2021
by Brigitte Krenn, et al.

In human-robot collaborative interaction scenarios, nonverbal communication plays an important role: signals sent by the human collaborator need to be identified and interpreted by the robotic system, and signals sent by the robot need to be identified and interpreted by the human. In this paper, we focus on the latter. On an industrial robot in a VR environment, we implemented nonverbal behavior signalling the user that it is now their turn to proceed with a pick-and-place task. The signals were presented in four test conditions: no signal, robot arm gesture, light signal, and a combination of robot arm gesture and light signal. The conditions were presented to the participants in two rounds. The qualitative analysis focused on (i) potential signals in human behaviour indicating why some participants immediately took over from the robot whereas others needed more time to explore, (ii) human reactions after the robot's nonverbal signal, and (iii) whether participants behaved differently across the test conditions. We could not identify signals explaining why some participants were immediately successful and others were not. Participants showed a wide range of behaviors after the robot stopped working: they rearranged the objects, looked at the robot or the object, or gestured to the robot to proceed. We found evidence that the robot's deictic gestures helped the human correctly interpret what to do next. Moreover, there was a strong tendency for humans to interpret the light signal projected on the robot's gripper as a request to hand the object in focus to the robot, whereas the robot's pointing gesture at the object was a strong trigger for the human to look at the object.
