Generative Adversarial Neural Architecture Search

Despite the empirical success of neural architecture search (NAS) in deep learning applications, the optimality, reproducibility, and cost of NAS schemes remain hard to assess. In this paper, we propose Generative Adversarial NAS (GA-NAS) with theoretically provable convergence guarantees, promoting stability and reproducibility in neural architecture search. Inspired by importance sampling, GA-NAS iteratively fits a generator to previously discovered top architectures, thus increasingly focusing on important parts of a large search space. Furthermore, we propose an efficient adversarial learning approach, in which the generator is trained by reinforcement learning using rewards provided by a discriminator, and can therefore explore the search space without evaluating a large number of architectures. Extensive experiments show that GA-NAS beats the best published results in several cases on three public NAS benchmarks. Moreover, GA-NAS can handle ad-hoc search constraints and search spaces. We show that GA-NAS can improve already-optimized baselines found by other NAS methods, including EfficientNet and ProxylessNAS, in terms of ImageNet accuracy or the number of parameters, within their original search spaces.
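The adversarial loop described in the abstract can be sketched in miniature. This is a toy illustration, not the authors' implementation: architectures are encoded as bit vectors, `true_fitness` is a hypothetical stand-in for an expensive architecture evaluation, the generator is an independent-Bernoulli distribution trained with REINFORCE, and the discriminator is a simple logistic regression that separates previously discovered top architectures from generated ones.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16  # toy encoding: an architecture is a 16-bit choice vector

def true_fitness(arch):
    # Hypothetical stand-in for an expensive architecture evaluation.
    return arch.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: independent Bernoulli over each choice, parameterized by logits.
gen_logits = np.zeros(DIM)

def sample_archs(n):
    p = sigmoid(gen_logits)
    return (rng.random((n, DIM)) < p).astype(float)

# Discriminator: logistic regression separating "top" from generated archs.
disc_w = np.zeros(DIM)
disc_b = 0.0

def disc_score(archs):
    return sigmoid(archs @ disc_w + disc_b)

history = sample_archs(64)  # initial random pool, all evaluated once

for it in range(30):
    # 1. Fit the discriminator: the top-k evaluated archs are "real".
    fitness = np.array([true_fitness(a) for a in history])
    top = history[np.argsort(fitness)[-16:]]
    fake = sample_archs(16)
    X = np.vstack([top, fake])
    y = np.array([1.0] * len(top) + [0.0] * len(fake))
    for _ in range(200):  # a few gradient steps of logistic regression
        g = sigmoid(X @ disc_w + disc_b) - y
        disc_w -= 0.1 * X.T @ g / len(y)
        disc_b -= 0.1 * g.mean()

    # 2. REINFORCE: the generator's reward is the discriminator's score,
    #    so no new true evaluations are needed for this update.
    samples = sample_archs(32)
    adv = disc_score(samples) - disc_score(samples).mean()  # baseline
    p = sigmoid(gen_logits)
    # d/d_logits log Bernoulli(x) = x - p, averaged with advantage weights.
    gen_logits += 0.5 * ((samples - p) * adv[:, None]).mean(axis=0)

    # 3. Evaluate only a few new samples and grow the pool of known archs.
    history = np.vstack([history, samples[:8]])
```

Because the generator is refit each round against an improving top-k set, its samples concentrate on the high-fitness region of the space, while the discriminator supplies cheap rewards in place of full evaluations.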

