Bag of Tricks for Neural Architecture Search

07/08/2021
by Thomas Elsken, et al.
While neural architecture search methods have been successful in recent years and have led to new state-of-the-art performance on various problems, they have also been criticized for being unstable, highly sensitive to their hyperparameters, and often no better than random search. To shed some light on this issue, we discuss practical considerations that help improve the stability, efficiency, and overall performance of neural architecture search methods.
