Weight-sharing Supernet for Searching Specialized Acoustic Event Classification Networks Across Device Constraints

03/18/2023
by Guan-Ting Lin, et al.

Acoustic Event Classification (AEC) is widely used in devices such as smart speakers and mobile phones for home safety and accessibility support. As AEC models run on an ever-wider range of devices with diverse computational resource constraints, it becomes increasingly expensive to develop a model tuned to the optimal accuracy/computation trade-off for each given constraint. In this paper, we introduce a Once-For-All (OFA) Neural Architecture Search (NAS) framework for AEC. Specifically, we first train a weight-sharing supernet that supports different model architectures, then automatically search for a model given specific computational resource constraints. Our experiments show that, after training only once, the model found by NAS significantly outperforms both models trained individually from scratch and models trained with knowledge distillation (a 25.4 improvement). We also find that for ultra-small models, the benefit of weight-sharing supernet training comes not only from the search but also from the optimization itself.
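The two-stage OFA workflow described above (train one weight-sharing supernet, then search it under a compute budget) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the search space (depth and per-block width multipliers), the cost proxy, and the accuracy stand-in are all hypothetical placeholders for real MAC counting and subnet evaluation with inherited supernet weights.

```python
import random

# Hypothetical search space: a subnet of the supernet is defined by its
# depth (number of blocks) and a width multiplier for each block.
DEPTHS = [2, 3, 4]
WIDTHS = [0.25, 0.5, 1.0]

def sample_subnet(rng):
    """Sample one architecture from the supernet's search space."""
    depth = rng.choice(DEPTHS)
    return {"depth": depth, "widths": [rng.choice(WIDTHS) for _ in range(depth)]}

def cost(subnet):
    """Toy compute-cost proxy (standing in for MACs): grows with width/depth."""
    return sum(w * w for w in subnet["widths"])

def predicted_accuracy(subnet):
    """Stand-in for evaluating a subnet with weights inherited from the
    trained supernet (no retraining needed, which is the point of OFA)."""
    return sum(subnet["widths"])  # toy: wider/deeper -> better

def search(budget, n_samples=1000, seed=0):
    """Random search: maximize predicted accuracy subject to a cost budget."""
    rng = random.Random(seed)
    best, best_acc = None, float("-inf")
    for _ in range(n_samples):
        cand = sample_subnet(rng)
        if cost(cand) <= budget:  # respect the device's resource constraint
            acc = predicted_accuracy(cand)
            if acc > best_acc:
                best, best_acc = cand, acc
    return best

# One supernet, many deployment targets: only the budget changes per device.
tiny_model = search(budget=0.5)   # tight constraint -> narrow/shallow subnet
large_model = search(budget=3.0)  # loose constraint -> wider/deeper subnet
```

The key property this illustrates is that specializing for a new device constraint requires only re-running the cheap search step, not retraining; in the real framework each candidate would be scored by running it with weights sliced out of the shared supernet.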
