From Pixels to Affect: A Study on Games and Player Experience

07/04/2019
by Konstantinos Makantasis, et al.

Is it possible to predict the affect of a user just by observing her behavioral interaction through a video? How can we, for instance, predict a user's arousal in games by merely looking at the screen during play? In this paper we address these questions by employing three dissimilar deep convolutional neural network architectures in an attempt to learn the underlying mapping between video streams of gameplay and the player's arousal. We test the algorithms on an annotated dataset of 50 gameplay videos of a survival shooter game and evaluate the deep learned models' capacity to classify high vs. low arousal levels. Our key findings with the demanding leave-one-video-out validation method reveal accuracies of over 78% and 98%. While this study focuses on games and gameplay videos as its test domain, the findings and methodology are directly relevant to any affective computing area, introducing a general and user-agnostic approach for modeling affect.
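To make the described pipeline concrete, the sketch below shows one way such a setup could look: a small convolutional network that maps individual gameplay frames to a binary high-vs-low arousal label, evaluated with leave-one-video-out splits. This is not the paper's architecture or data; the layer sizes, the 64x64 frame resolution, the training schedule, and the synthetic stand-in data are all illustrative assumptions.

import torch
import torch.nn as nn

class ArousalCNN(nn.Module):
    """Tiny CNN: RGB gameplay frame -> logit for 'high arousal' (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 1)

    def forward(self, x):
        h = self.features(x).flatten(1)
        return self.classifier(h).squeeze(1)  # raw logits per frame

def leave_one_video_out(frames, labels, video_ids, epochs=3):
    """Hold out one video at a time, train on the rest, report test accuracy per fold."""
    accuracies = []
    for held_out in sorted(set(video_ids.tolist())):
        train_mask = video_ids != held_out
        test_mask = ~train_mask

        model = ArousalCNN()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.BCEWithLogitsLoss()

        model.train()
        for _ in range(epochs):
            opt.zero_grad()
            logits = model(frames[train_mask])
            loss = loss_fn(logits, labels[train_mask].float())
            loss.backward()
            opt.step()

        model.eval()
        with torch.no_grad():
            preds = (model(frames[test_mask]) > 0).long()
            acc = (preds == labels[test_mask]).float().mean().item()
        accuracies.append(acc)
    return accuracies

if __name__ == "__main__":
    # Synthetic stand-in data: 5 "videos" of 8 frames each, 64x64 RGB, random labels.
    torch.manual_seed(0)
    frames = torch.rand(40, 3, 64, 64)
    labels = torch.randint(0, 2, (40,))
    video_ids = torch.arange(40) // 8
    print(leave_one_video_out(frames, labels, video_ids))

The essential point of this evaluation protocol is that frames from the held-out gameplay video never appear in training, so the reported accuracy reflects generalization to an unseen play session rather than memorization of frames from the same video.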
