SiftingGAN: Generating and Sifting Labeled Samples to Improve the Remote Sensing Image Scene Classification Baseline in vitro

09/13/2018
by Dongao Ma, et al.

Lack of annotated samples severely restrains the direct application of deep supervised learning to remote sensing image scene classification. Many studies try to tackle this issue with the aid of the unsupervised learning capability of generative adversarial networks (GANs). However, in these studies the generated samples are used only inside the GAN for training, which does not demonstrate the effectiveness of GAN-generated samples as augmentation data for training other deep networks. Moreover, traditional image transformation operations such as flipping and rotation are still widely applied for data augmentation, but they are limited in quantity and diversity. Whether GAN-generated samples perform better than transformed samples thus remains an open question. Therefore, we propose a SiftingGAN framework to generate more numerous, more diverse, and more authentic labeled samples for data augmentation. SiftingGAN extends the traditional GAN framework with an Online-Output method for sample generation, a Generative-Model-Sifting method for model sifting, and a Labeled-Sample-Discriminating method for sample sifting. We conduct three groups of control experiments by varying the ratio of original to augmented data and by applying different augmented samples. Experimental results on the AID dataset verify that the samples generated by the proposed SiftingGAN effectively improve the scene classification baseline and perform better than samples produced by traditional geometric transformation operations.
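The sample-sifting idea can be illustrated with a minimal sketch: generate a batch of candidate samples for a class, score them with the discriminator, and keep only the highest-scoring (most authentic-looking) ones. This sketch assumes a conditional GAN whose discriminator returns a realness score per sample; the `generator` and `discriminator` interfaces, latent dimension, and batch sizes below are hypothetical placeholders, not the authors' actual implementation.

```python
import torch

@torch.no_grad()
def sift_generated_samples(generator, discriminator, label,
                           n_candidates=512, n_keep=64,
                           latent_dim=100, device="cpu"):
    """Generate candidate samples for one class label and keep the
    n_keep samples the discriminator scores as most authentic."""
    # Sample latent vectors and a matching vector of class labels.
    z = torch.randn(n_candidates, latent_dim, device=device)
    labels = torch.full((n_candidates,), label, dtype=torch.long, device=device)

    # Hypothetical conditional-GAN calls: images from (noise, label),
    # one realness score per (image, label) pair.
    fake = generator(z, labels)
    scores = discriminator(fake, labels).view(-1)

    # Sift: retain only the samples the discriminator finds most realistic,
    # for use as labeled augmentation data when training another classifier.
    top = scores.topk(n_keep).indices
    return fake[top], labels[top]
```

An analogous filter could be applied one level up, scoring saved generator checkpoints and keeping only the best-performing models, which is how the abstract's Generative-Model-Sifting step appears to complement this per-sample sifting.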
