ETRI Knowledge Sharing Platform

Initializing Deep Learning Based on Latent Dirichlet Allocation for Document Classification
Cited 1 time in Scopus
Authors
Hyung-Bae Jeon, Soo-Young Lee
Issue Date
2016-10
Citation
International Conference on Neural Information Processing (ICONIP) 2016 (LNCS 9949), v.9949, pp.634-641
Publisher
Springer
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1007/978-3-319-46675-0_70
Abstract
Gradient-descent learning of deep neural networks is subject to local minima, and a good initialization may depend on the task. For document classification tasks, latent Dirichlet allocation (LDA) has been quite successful at extracting topic representations, but its performance is limited by its shallow architecture. In this study, LDA was adopted for efficient layer-by-layer pre-training of deep neural networks for a document classification task. A two-layer feedforward network was added at the end of the process and trained with a supervised learning algorithm. Across 10 different random initializations, the LDA-based initialization yielded a much lower mean and standard deviation of false recognition rates than other state-of-the-art initialization methods. This suggests that the multi-layer expansion of the probabilistic generative LDA model can extract efficient hierarchical topic representations for document classification.
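The layer-by-layer procedure described in the abstract can be sketched with scikit-learn's `LatentDirichletAllocation`. This is a minimal illustration, not the paper's exact recipe: the pseudo-count rescaling between layers and the final `MLPClassifier` stage are assumptions standing in for the paper's weight-seeding and supervised fine-tuning details.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.neural_network import MLPClassifier

# Toy bag-of-words data: 100 documents over a 50-word vocabulary.
rng = np.random.default_rng(0)
X = rng.poisson(1.0, size=(100, 50)).astype(float)
y = rng.integers(0, 2, size=100)

# Layer-by-layer LDA pre-training: each layer's document-topic
# mixture is rescaled into pseudo-counts and fed to the next LDA.
layer_sizes = [20, 10]
reps = X
pretrained = []  # topic-word matrices, usable to seed network weights
for k in layer_sizes:
    lda = LatentDirichletAllocation(n_components=k, max_iter=10,
                                    random_state=0)
    theta = lda.fit_transform(reps)   # (n_docs, k) topic proportions
    pretrained.append(lda.components_)
    reps = theta * 100.0              # rescale proportions to pseudo-counts

# Supervised stage: a small feedforward network trained on the
# final hierarchical topic representation (fine-tuning stand-in).
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300,
                    random_state=0).fit(reps, y)
preds = clf.predict(reps)
```

In the paper the LDA parameters initialize the deep network's weights directly before end-to-end fine-tuning; here the stacked topic mixtures merely feed a separate classifier, which keeps the sketch short while showing the hierarchical extraction step.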
KSP Keywords
Classification task, Deep neural network (DNN), Feedforward networks, Hierarchical topic, Initialization methods, LDA model, Latent Dirichlet allocation (LDA), Layer expansion, Layer-by-layer (LbL), Local minima, Pre-training