ETRI Knowledge Sharing Platform

Detailed Information

Conference Paper: Initializing Deep Learning Based on Latent Dirichlet Allocation for Document Classification
Cited 1 time in Scopus
Authors
전형배, 이수영
Publication Date
October 2016
Source
International Conference on Neural Information Processing (ICONIP) 2016 (LNCS 9949), v.9949, pp.634-641
DOI
https://dx.doi.org/10.1007/978-3-319-46675-0_70
Research Project
16MS1700, Development of Core Technology for Free-Speech Spoken Dialogue Processing for Language Learning, 이윤근
Abstract
The gradient-descent learning of deep neural networks is subject to local minima, and good initialization may depend on the task. In contrast, for document classification tasks, latent Dirichlet allocation (LDA) was quite successful in extracting topic representations, but its performance was limited by its shallow architecture. In this study, LDA was adopted for efficient layer-by-layer pre-training of deep neural networks for a document classification task. Two-layer feedforward networks were added at the end of the process and trained using a supervised learning algorithm. With 10 different random initializations, the LDA-based initialization generated a much lower mean and standard deviation of false recognition rates than other state-of-the-art initialization methods. This suggests that the multi-layer expansion of the probabilistic generative LDA model is capable of extracting efficient hierarchical topic representations for document classification.
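The pipeline described in the abstract can be approximated in a few lines. The sketch below is a rough analogue, not the paper's exact method: the paper stacks LDA layer by layer and converts the learned topic distributions into initial network weights, whereas this sketch uses scikit-learn's single-layer `LatentDirichletAllocation` to produce per-document topic posteriors as pre-trained features, then trains a small feedforward classifier on top of them (the supervised fine-tuning stage). The synthetic bag-of-words data and all dimensions (200 documents, 50-word vocabulary, 10 topics) are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic bag-of-words term-count matrix (documents x vocabulary)
# and binary class labels; stands in for a real document corpus.
X = rng.poisson(1.0, size=(200, 50))
y = rng.integers(0, 2, size=200)

# Unsupervised "pre-training" stage: LDA extracts a topic
# representation for each document (rows sum to 1).
lda = LatentDirichletAllocation(n_components=10, random_state=0)
H = lda.fit_transform(X)

# Supervised stage: a two-layer feedforward classifier trained
# on the LDA topic features, mirroring the fine-tuning step.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(H, y)
```

In the paper's actual scheme the LDA topic-word matrices seed the hidden-layer weights of the deep network itself, so all layers are refined jointly during supervised training rather than keeping the LDA features fixed as done here.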
Keywords
Deep learning, Document classification, Good initialization, Latent Dirichlet allocation
KSP Suggested Keywords
Classification task, Deep neural network (DNN), Feedforward networks, Hierarchical topic, Initialization methods, LDA model, Latent Dirichlet allocation (LDA), Layer expansion, Layer-by-layer (LbL), Local minima, Pre-training