ECAI 2010, Springer

Unsupervised Layer-Wise Model Selection in Deep Neural Networks

Abstract. Deep Neural Networks (DNNs) constitute an efficient machine learning architecture based on the layer-wise construction of several representation layers. A critical issue for DNNs remains model selection, e.g., choosing the number of neurons in each layer. The hyper-parameter search space grows exponentially with the number of layers, making the popular grid-search approach to finding good hyper-parameter values intractable. The question investigated in this paper is whether the unsupervised, layer-wise methodology used to train a DNN can be extended to model selection as well. The proposed approach, based on an unsupervised criterion, empirically examines whether model selection is a modular optimization problem that can be tackled in a layer-wise manner. Preliminary results on the MNIST data set suggest that the answer is positive. Furthermore, some unexpected results regarding the optimal layer size, depending on the training process, are reported and discussed.
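As a rough sketch of the idea (not the authors' code), the snippet below illustrates greedy layer-wise model selection under two assumptions: the layers are tied-weight sigmoid autoencoders trained by plain gradient descent, and the unsupervised criterion is validation reconstruction error. All function names, the toy data and the hyper-parameters are illustrative placeholders.

# Illustrative sketch only (not the paper's implementation): greedy layer-wise
# model selection for a stack of tied-weight sigmoid autoencoders, using
# validation reconstruction error as a stand-in for the unsupervised criterion.
# The toy data, candidate widths, and hyper-parameters are placeholder assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_autoencoder(X, n_hidden, epochs=100, lr=0.5):
    """Fit a tied-weight sigmoid autoencoder with plain batch gradient descent."""
    n, d = X.shape
    W = rng.normal(scale=0.1, size=(d, n_hidden))
    b_h, b_v = np.zeros(n_hidden), np.zeros(d)
    for _ in range(epochs):
        H = sigmoid(X @ W + b_h)             # encode
        R = sigmoid(H @ W.T + b_v)           # decode with the transposed weights
        dR = (R - X) * R * (1 - R)           # gradient at the decoder pre-activation
        dH = (dR @ W) * H * (1 - H)          # back-propagated to the encoder pre-activation
        W -= lr / n * (X.T @ dH + dR.T @ H)  # tied weights: sum of both contributions
        b_h -= lr / n * dH.sum(axis=0)
        b_v -= lr / n * dR.sum(axis=0)
    return W, b_h, b_v

def reconstruction_error(X, W, b_h, b_v):
    """Unsupervised criterion: mean squared reconstruction error."""
    H = sigmoid(X @ W + b_h)
    return np.mean((sigmoid(H @ W.T + b_v) - X) ** 2)

def layer_wise_selection(X_train, X_valid, candidate_widths, n_layers):
    """Pick each layer's width greedily, then freeze it and move one layer up."""
    chosen = []
    for _ in range(n_layers):
        best = None
        for width in candidate_widths:
            params = train_autoencoder(X_train, width)
            score = reconstruction_error(X_valid, *params)
            if best is None or score < best[0]:
                best = (score, width, params)
        score, width, (W, b_h, b_v) = best
        chosen.append(width)
        # propagate the data through the selected (frozen) layer
        X_train = sigmoid(X_train @ W + b_h)
        X_valid = sigmoid(X_valid @ W + b_h)
    return chosen

# Toy usage on random binary vectors standing in for MNIST-like inputs.
X = (rng.random((300, 64)) > 0.5).astype(float)
print(layer_wise_selection(X[:200], X[200:], candidate_widths=[8, 16, 32], n_layers=2))

The point mirrored from the abstract is the modularity: once a layer's width is chosen, that layer is frozen and the search for the next layer's width runs on its output representation, so the search cost grows linearly rather than exponentially with depth.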
Type Conference
Year 2010
Where ECAI
Authors Ludovic Arnold, Hélène Paugam-Moisy, Michèle Sebag