ETRI Knowledge Sharing Platform


Detailed Information

Conference Paper: Parameter Reduction For Deep Neural Network Based Acoustic Models Using Sparsity Regularized Factorization Neurons
Cited 4 times in Scopus
Authors
정훈, 정의석, 박전규, 정호영
Publication Date
2019.07
Source
International Joint Conference on Neural Networks (IJCNN) 2019, pp.1-5
DOI
https://dx.doi.org/10.1109/IJCNN.2019.8852021
Project
19HS2500, Development of semi-supervised learning-based language intelligence core technology and a Korean tutoring service for foreigners built on it, 이윤근
Abstract
In this paper, we propose a deep neural network (DNN) model parameter reduction technique for an efficient acoustic model. One of the most common DNN model parameter reduction techniques is low-rank matrix approximation. Although it can remove a significant number of model parameters, two problems must be considered: performance degradation and the selection of an appropriate rank. To address these, retraining is typically carried out and the so-called explained variance is used. However, retraining takes additional time, and explained variance is not directly related to classification performance. Therefore, to mitigate these problems, we propose an approach that performs model parameter reduction during model training itself, from the perspective of minimizing classification error. The proposed method uses the product of three factorized matrices instead of a dense weight matrix, and applies a sparsity constraint that drives entries of the center diagonal matrix to zero. After training, a parameter-reduced model is obtained by discarding the left and right vectors corresponding to the zero entries of the center diagonal matrix.
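The factorization described in the abstract can be sketched in code. Below is a minimal, hypothetical PyTorch example, not the authors' implementation: the class name FactorizedLinear, the L1 form of the sparsity penalty, and the pruning threshold are assumptions for illustration. A dense weight matrix W is replaced by the product U diag(d) V, an L1 penalty on d drives diagonal entries toward zero during training, and pruning afterwards discards the left and right vectors tied to zero entries.

import torch
import torch.nn as nn

class FactorizedLinear(nn.Module):
    """Sketch of a linear layer whose weight is U diag(d) V (assumed form)."""

    def __init__(self, in_dim, out_dim, rank):
        super().__init__()
        self.U = nn.Parameter(torch.randn(out_dim, rank) * 0.01)  # left factor
        self.d = nn.Parameter(torch.ones(rank))                   # center diagonal entries
        self.V = nn.Parameter(torch.randn(rank, in_dim) * 0.01)   # right factor

    def forward(self, x):
        # W = U diag(d) V has shape (out_dim, in_dim); a linear layer computes x W^T.
        W = (self.U * self.d).matmul(self.V)
        return x @ W.t()

    def sparsity_penalty(self):
        # L1 regularizer on the diagonal entries (penalty form is an assumption).
        return self.d.abs().sum()

    def prune(self, threshold=1e-3):
        # Keep only the factor columns/rows whose diagonal entry survives the threshold.
        keep = self.d.abs() > threshold
        self.U = nn.Parameter(self.U[:, keep].detach())
        self.V = nn.Parameter(self.V[keep, :].detach())
        self.d = nn.Parameter(self.d[keep].detach())

In training, the total loss would be the classification loss plus a weighting factor times layer.sparsity_penalty(); calling layer.prune() after training yields the parameter-reduced layer directly, without a separate rank-selection or retraining step.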
KSP Suggested Keywords
Classification Performance, Deep neural network(DNN), Low-Rank Matrix Approximation, Model parameter, Reduced model, Reduction technique, acoustic model, classification error, diagonal matrix, parameter reduction, performance degradation