SOGAN: 3D-Aware Shadow and Occlusion Robust GAN for Makeup Transfer

04/21/2021
by Yueming Lyu, et al.

In recent years, virtual makeup applications have become increasingly popular. However, robust makeup transfer in real-world environments remains challenging. Current makeup transfer methods mostly work well on well-conditioned, clean makeup images, but their results are unsatisfactory when the reference makeup exhibits shadow or occlusion. To alleviate this, we propose a novel makeup transfer method, called 3D-Aware Shadow and Occlusion Robust GAN (SOGAN). Given the source and reference faces, we first fit a 3D face model and then disentangle the faces into shape and texture. In the texture branch, we map the texture to UV space and design a UV texture generator to transfer the makeup. Since human faces are symmetrical in UV space, we can conveniently remove undesired shadow and occlusion from the reference image with a carefully designed Flip Attention Module (FAM). After obtaining cleaner makeup features from the reference image, a Makeup Transfer Module (MTM) performs accurate makeup transfer. Qualitative and quantitative experiments demonstrate that SOGAN not only achieves superior results under shadow and occlusion but also handles large variations in pose and expression.
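
To make the flip-and-attend idea behind the FAM concrete, below is a minimal PyTorch sketch: since the UV texture of a face is roughly left-right symmetric, features corrupted by shadow or occlusion can be repaired from their mirrored counterpart via a learned attention map. The layer choices, channel sizes, and fusion rule here are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class FlipAttentionSketch(nn.Module):
    """Toy flip-attention step on UV-space texture features.

    Predicts a per-pixel attention map from the original and horizontally
    flipped feature maps, then blends the two so that corrupted regions
    can borrow from their mirrored (presumably cleaner) counterparts.
    Hypothetical layer choices; not the SOGAN authors' code.
    """

    def __init__(self, channels: int = 64):
        super().__init__()
        # Soft blending weight predicted from the concatenation of the
        # original and flipped feature maps.
        self.attn = nn.Sequential(
            nn.Conv2d(channels * 2, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, H, W) UV-space features of the reference face.
        flipped = torch.flip(feat, dims=[3])              # mirror along width
        a = self.attn(torch.cat([feat, flipped], dim=1))  # (B, 1, H, W) in [0, 1]
        # a -> 0 keeps the original region; a -> 1 takes the mirrored one.
        return (1.0 - a) * feat + a * flipped


if __name__ == "__main__":
    fam = FlipAttentionSketch(channels=64)
    x = torch.randn(2, 64, 128, 128)  # dummy UV feature maps
    print(fam(x).shape)               # torch.Size([2, 64, 128, 128])
```

In this sketch the attention map is learned end to end, so the network itself decides where the reference texture is unreliable and how strongly to fall back on the mirrored side.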
