Perceptual Quality Assessment of 360° Images Based on Generative Scanpath Representation
Despite substantial efforts dedicated to designing heuristic models for omnidirectional (i.e., 360°) image quality assessment (OIQA), a conspicuous gap remains: the diversity of viewing behaviors, which leads to varying perceptual quality of 360° images, has been largely ignored. Two critical aspects underlie this oversight: the neglect of viewing conditions that significantly sway user gaze patterns, and the overreliance on a single viewport sequence from the 360° image for quality inference. To address these issues, we introduce a unique generative scanpath representation (GSR) for effective quality inference of 360° images, which aggregates the varied perceptual experiences of multiple hypothetical users under a predefined viewing condition. More specifically, given a viewing condition characterized by the starting point of viewing and the exploration time, a set of scanpaths consisting of dynamic visual fixations is produced by an apt scanpath generator. We then use these scanpaths to convert the 360° image into the GSR, which provides a global overview of the gaze-focused contents derived from the scanpaths. As such, quality inference of the 360° image is swiftly transformed into quality inference of the GSR. We further propose an efficient OIQA computational framework that learns the quality maps of the GSR. Comprehensive experimental results validate that the predictions of the proposed framework are highly consistent with human perception in the spatiotemporal domain, especially in the challenging context of locally distorted 360° images under varied viewing conditions. The code will be released at https://github.com/xiangjieSui/GSR.
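To make the pipeline concrete, the sketch below illustrates how generated scanpaths could be aggregated into a GSR tensor before quality regression. All function names, shapes, and the crude viewport crop are illustrative assumptions, not the released API (see the repository above for the authors' implementation); in particular, `sample_viewport` uses a naive window crop where a faithful implementation would apply gnomonic (rectilinear) projection.

```python
import numpy as np

def sample_viewport(erp, lon, lat, size=224):
    """Crude placeholder for viewport extraction: crops a window of the
    equirectangular (ERP) image around the gaze point (lon, lat in degrees).
    A faithful implementation would use gnomonic (rectilinear) projection."""
    h, w = erp.shape[:2]
    cx = int((lon % 360.0) / 360.0 * w)          # longitude -> column
    cy = int((90.0 - lat) / 180.0 * h)           # latitude  -> row
    ys = np.clip(np.arange(cy - size // 2, cy + size // 2), 0, h - 1)
    xs = np.arange(cx - size // 2, cx + size // 2) % w  # wrap horizontally
    return erp[np.ix_(ys, xs)]                   # (size, size, 3)

def build_gsr(erp, scanpaths, size=224):
    """Aggregate viewports along multiple generated scanpaths into a
    generative scanpath representation: one viewport sequence per
    simulated user, stacked into a spatiotemporal tensor."""
    sequences = []
    for path in scanpaths:                       # one path per simulated user
        frames = [sample_viewport(erp, lon, lat, size)
                  for (lon, lat) in path]        # fixations -> viewports
        sequences.append(np.stack(frames))       # (T, size, size, 3)
    return np.stack(sequences)                   # (N_users, T, size, size, 3)

# Assumed usage with hypothetical interfaces:
#   paths = scanpath_generator(erp, start_point, explore_time)
#   gsr   = build_gsr(erp, paths)
#   score = quality_model(gsr)   # regresses quality from GSR quality maps
```

In this reading, the predefined viewing condition (starting point and exploration time) only enters through the scanpath generator, so the downstream quality model sees a fixed-shape tensor regardless of how users explore the scene.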