Inducing Early Neural Collapse in Deep Neural Networks for Improved Out-of-Distribution Detection

09/17/2022
by   Jarrod Haas, et al.

We propose a simple modification to standard ResNet architectures, L2 regularization over feature space, that substantially improves out-of-distribution (OoD) performance on the previously proposed Deep Deterministic Uncertainty (DDU) benchmark. This change also induces early Neural Collapse (NC), an effect under which we show better OoD performance is more probable. Our method achieves comparable or superior OoD detection scores and classification accuracy in a small fraction of the training time of the benchmark. It also substantially improves worst-case OoD performance over multiple randomly initialized models. Although we do not suggest that NC is the sole mechanism or a comprehensive explanation for OoD behaviour in deep neural networks (DNNs), we believe NC's simple mathematical and geometric structure can provide a framework for the analysis of this complex phenomenon in future work.
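To make the core idea concrete, the sketch below shows one way "L2 regularization over feature space" can be implemented: an L2 penalty on a ResNet's penultimate-layer features is added to the standard cross-entropy loss. This is a minimal PyTorch sketch under stated assumptions, not the paper's reported configuration: the wrapper class `FeatureL2ResNet`, the ResNet-18 backbone, and the coefficient `feature_l2_weight` are illustrative choices.

```python
# Minimal sketch (PyTorch): cross-entropy plus an L2 penalty over the
# penultimate feature space. The coefficient and the hook point are
# assumptions for illustration, not the paper's reported settings.
import torch
import torch.nn as nn
import torchvision.models as models

class FeatureL2ResNet(nn.Module):
    """ResNet-18 wrapper that exposes penultimate features for regularization."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        backbone = models.resnet18(num_classes=num_classes)
        # Split off the final classifier so the feature vector is accessible.
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        self.fc = backbone.fc

    def forward(self, x: torch.Tensor):
        z = self.features(x).flatten(1)  # penultimate feature vector
        return self.fc(z), z

model = FeatureL2ResNet(num_classes=10)
criterion = nn.CrossEntropyLoss()
feature_l2_weight = 1e-2  # hypothetical value; tune per dataset

x = torch.randn(8, 3, 32, 32)        # dummy CIFAR-sized batch
y = torch.randint(0, 10, (8,))

logits, z = model(x)
# Penalizing squared feature norms shrinks the feature space, which is
# the kind of pressure the abstract associates with early Neural Collapse.
loss = criterion(logits, y) + feature_l2_weight * z.pow(2).sum(dim=1).mean()
loss.backward()
```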
