BehaVR: User Identification Based on VR Sensor Data
Virtual reality (VR) platforms enable a wide range of applications, but also pose unique privacy risks. In particular, VR devices are equipped with a rich set of sensors that collect personal and sensitive information (e.g., body motion, eye gaze, hand joints, and facial expression), which can be used to uniquely identify a user, even without explicit identifiers. In this paper, we are interested in understanding the extent to which a user can be identified based on data collected by different VR sensors. We consider adversaries with capabilities that range from observing APIs available within a single VR app (app adversary) to observing all, or selected, sensor measurements across all apps on the VR device (device adversary). To that end, we introduce BEHAVR, a framework for collecting and analyzing data from all sensor groups collected by all apps running on a VR device. We use BEHAVR to perform a user study and collect data from real users who interact with popular real-world apps. We use that data to build machine learning models for user identification, with features extracted from sensor data available within and across apps. We show that these models can identify users with an accuracy of up to 100%, and we reveal the most important features and sensor groups, depending on the functionality of the app and the strength of the adversary, as well as the minimum time needed for user identification. To the best of our knowledge, BEHAVR is the first to analyze user identification in VR comprehensively, i.e., considering jointly all sensor measurements available on a VR device (whether within an app or across multiple apps), collected by real-world, as opposed to custom-made, apps.
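The identification pipeline described above (per-user features extracted from sensor time series, fed to a supervised classifier) can be illustrated with a minimal sketch. Note this is not the paper's actual implementation: the synthetic sensor windows, the summary-statistic features, and the random-forest classifier are all illustrative assumptions standing in for real VR sensor data and BEHAVR's feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def extract_features(window):
    # Per-channel summary statistics (mean, std, min, max): a common
    # baseline feature set for behavioral-biometric identification.
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

# Synthetic stand-in for VR sensor recordings: 5 "users", each with a
# distinct per-channel offset simulating an individual motion signature;
# 40 windows per user, each 100 samples x 6 sensor channels.
X, y = [], []
for user in range(5):
    signature = rng.normal(size=6) * 2.0  # hypothetical user-specific bias
    for _ in range(40):
        window = rng.normal(size=(100, 6)) + signature
        X.append(extract_features(window))
        y.append(user)
X, y = np.array(X), np.array(y)

# Train a classifier to map feature vectors back to user identities.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

On such well-separated synthetic data the classifier identifies users with high accuracy, which mirrors the paper's finding that behavioral sensor streams act as a de facto identifier even without explicit IDs; `feature_importances_` on the fitted model would likewise indicate which channels drive identification.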