ShuffleNASNets: Efficient CNN models through modified Efficient Neural Architecture Search

12/07/2018
by   Kevin Alexander Laube, et al.

Neural network architectures found by sophisticated search algorithms achieve strikingly good test performance, surpassing most human-crafted network models by significant margins. Although computationally efficient, their design is often very complex, impairing execution speed. Additionally, finding models outside of the search space is not possible by design. While our space is still limited, we implement undiscoverable expert knowledge into the economic search algorithm Efficient Neural Architecture Search (ENAS), guided by the design principles and architecture of ShuffleNet V2. While maintaining baseline-like 2.85% test error on CIFAR-10, our ShuffleNASNets are significantly less complex, require fewer parameters, and are two times faster than the ENAS baseline in a classification task. These models also scale well to a low parameter space, achieving less than 5% test error with only 236K parameters.
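A central ShuffleNet V2 design element referenced here is the channel shuffle operation, which mixes information across channel groups after grouped convolutions. The sketch below is an illustrative NumPy implementation of the standard channel shuffle (not code from the paper); the function name and array layout are assumptions for demonstration.

```python
import numpy as np

def channel_shuffle(x, groups):
    """Shuffle channels of a (batch, channels, height, width) tensor.

    Standard ShuffleNet-style shuffle: split channels into `groups`,
    then interleave them by transposing the group and channel axes.
    This is an illustrative sketch, not the paper's implementation.
    """
    n, c, h, w = x.shape
    assert c % groups == 0, "channels must be divisible by groups"
    # (n, c, h, w) -> (n, groups, c // groups, h, w)
    x = x.reshape(n, groups, c // groups, h, w)
    # swap group and per-group channel axes, then flatten back
    x = x.transpose(0, 2, 1, 3, 4)
    return x.reshape(n, c, h, w)
```

For example, with 4 channels and 2 groups, channel order [0, 1, 2, 3] becomes [0, 2, 1, 3], so subsequent grouped operations see channels from both original groups.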
