Few-Features Attack to Fool Machine Learning Models through Mask-Based GAN

11/12/2019
by Feng Chen, et al.

The generative adversarial network (GAN) is a deep-learning-based generative approach for producing content such as images, language, and speech. Recent studies have shown that GANs can also be used to generate adversarial examples that fool machine-learning models. Compared with earlier non-learning attack approaches, a GAN-based attack can, once trained, generate an adversarial sample quickly for each new input; however, it tends to perturb a large number of features, which limits its practicality. To address this issue, we propose a new approach named Few-Feature-Attack-GAN (FFA-GAN). FFA-GAN is significantly faster than non-learning attack approaches and perturbs fewer features (i.e., produces fewer non-zero perturbation entries) than existing GAN-based approaches. FFA-GAN automatically generates black-box attack samples through the GAN architecture instead of evolutionary algorithms or other non-learning methods. In addition, we introduce a mask mechanism into the generator network to handle the constraint on the number of perturbed features, which can be regarded as a sparsity problem over the important features. During training, the loss weights of the generator are set differently in different training phases to ensure the divergence of the two parallel networks within the generator. Experiments are conducted on the structured, relatively low-dimensional data sets KDD-Cup 1999 and CIC-IDS 2017, as well as on the unstructured, higher-dimensional data sets MNIST and CIFAR-10. The results demonstrate the effectiveness and robustness of the proposed approach.
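The abstract does not spell out the exact architecture, but the core idea of a mask mechanism for sparse perturbations can be illustrated with a minimal PyTorch sketch. Everything below is an illustrative assumption, not the paper's implementation: the class name MaskedGenerator, the layer sizes, and the loss weights w_adv and w_sparse are hypothetical, and the generator here has two parallel heads, one proposing perturbation values and one proposing a soft feature-selection mask.

```python
# Minimal sketch (assumptions, not the paper's code): a generator with two
# parallel heads -- one produces a perturbation for every feature, the other
# produces a soft mask that selects which features actually get perturbed.
import torch
import torch.nn as nn

class MaskedGenerator(nn.Module):
    def __init__(self, n_features: int, hidden: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.perturb_head = nn.Linear(hidden, n_features)  # perturbation values
        self.mask_head = nn.Linear(hidden, n_features)     # feature-selection mask

    def forward(self, x: torch.Tensor):
        h = self.backbone(x)
        delta = torch.tanh(self.perturb_head(h))   # bounded perturbation
        mask = torch.sigmoid(self.mask_head(h))    # soft mask in (0, 1)
        x_adv = x + mask * delta                   # perturb only masked features
        return x_adv, mask

def generator_loss(victim, gen, x, y_true, w_adv=1.0, w_sparse=0.1):
    # `victim` stands in for the attacked classifier; in a true black-box
    # setting it would be a differentiable substitute of the target model.
    x_adv, mask = gen(x)
    logits = victim(x_adv)
    # Adversarial term: push the victim away from the true label (untargeted).
    adv_loss = -nn.functional.cross_entropy(logits, y_true)
    # Sparsity term: an L1 penalty on the mask keeps the number of perturbed
    # features small -- the "few-features" constraint.
    sparse_loss = mask.abs().mean()
    # Per the abstract, the relative weights would be scheduled differently
    # across training phases; fixed values are used here for simplicity.
    return w_adv * adv_loss + w_sparse * sparse_loss
```

The design point this sketch captures is the separation of concerns: the perturbation head decides how to change features while the mask head decides which features to change, and the sparsity penalty on the mask directly trades attack strength against the number of non-zero perturbation entries.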
