12:00: Jianfeng (Jeff) Yao: Impact of classification difficulty on the weight matrices spectra in Deep Learning and application to early-stopping
16 May 2022, 12:00 to 13:00
Random Matrix Theory (RMT) helps us understand Deep Learning by analyzing the spectra of the large weight matrices of a trained deep neural network (DNN). We conduct extensive experiments on such weight matrices under different settings of layers, networks and data sets. Following the previous work of Martin and Mahoney (2021), spectra of weight matrices at the terminal stage of training are classified into three main types: Light Tail (LT), Bulk Transition period (BT) and Heavy Tail (HT). A main contribution of the paper is the identification of the difficulty of the classification problem as a driving factor for the appearance of HT in weight matrix spectra. Moreover, the classification difficulty can be affected either by the signal-to-noise ratio of the dataset or by the complexity of the classification problem (complex features, large number of classes). Leveraging this finding, we further propose a spectral criterion that detects the appearance of HT and use it to early-stop the training process without any test data. Such early-stopped DNNs avoid overfitting and unnecessary extra training while preserving comparable generalization ability. The findings of the paper are validated on several networks (LeNet, MiniAlexNet and VGG), using Gaussian synthetic data and real data sets (MNIST and CIFAR10).
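The kind of analysis described above can be illustrated with a short sketch: compute the empirical spectral distribution (ESD) of a layer's weight correlation matrix and estimate a power-law tail exponent, where a small exponent signals a heavy tail. This is a hypothetical illustration only (the function names and the use of a Hill-type estimator are assumptions; the talk's exact spectral criterion may differ):

```python
import numpy as np

def esd(weight):
    """Empirical spectral distribution: eigenvalues of the correlation
    matrix X = W^T W / N for an N x M weight matrix W (with M <= N)."""
    n, m = weight.shape
    if m > n:  # ensure the tall orientation so X is M x M
        weight = weight.T
        n, m = m, n
    x = weight.T @ weight / n
    return np.linalg.eigvalsh(x)  # ascending, real eigenvalues

def tail_alpha(eigs, k=None):
    """Maximum-likelihood (Hill-type) estimate of the power-law exponent
    of the upper tail of the ESD. A smaller alpha indicates a heavier
    tail; in the talk's terminology, the onset of the HT regime.
    (Hypothetical sketch, not the paper's exact criterion.)"""
    eigs = np.sort(eigs)
    eigs = eigs[eigs > 0]
    if k is None:
        k = max(2, len(eigs) // 10)  # use the top ~10% as the tail
    tail = eigs[-k:]
    xmin = tail[0]
    return 1.0 + k / np.sum(np.log(tail / xmin))
```

On an untrained Gaussian weight matrix, the ESD follows the Marchenko-Pastur law (Light Tail); during training, a drift of the estimated exponent toward small values would flag the transition to Heavy Tail and hence a candidate early-stopping point.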