ETRI Knowledge Sharing Platform



Journal Article: Densely Distilled Flow-Based Knowledge Transfer in Teacher-Student Framework for Image Classification
Cited 20 times in Scopus
배지훈, 여도엽, 임준호, 김내수, 표철식, 김준모
IEEE Transactions on Image Processing, v.29, pp.5698-5710
We propose a new teacher-student framework (TSF)-based knowledge transfer method, in which knowledge in the form of dense flow across layers is distilled from a pre-trained "teacher" deep neural network (DNN) and transferred to another "student" DNN. To form the distilled knowledge, multiple overlapping flow-based items of information are densely extracted across the layers of the pre-trained teacher DNN. The densely extracted teacher information is then transferred in the TSF through repetitive sequential training, from bottom to top, between the teacher and student DNN models. In other words, to efficiently transmit the extracted teacher information to the student DNN, we perform a bottom-up, step-by-step transfer of the densely distilled knowledge. The performance of the proposed method, in terms of image classification accuracy and fast optimization, is compared with that of existing TSF-based knowledge transfer methods on standard image datasets, including CIFAR-10, CIFAR-100, MNIST, and SVHN. When the dense flow-based sequential knowledge transfer scheme is employed in the TSF, the trained student ResNet more accurately reflects the rich information of the pre-trained teacher ResNet and achieves higher accuracy than the existing TSF-based knowledge transfer methods on all benchmark datasets considered in this study.
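The flow-based transfer described in the abstract can be sketched in code. The sketch below is an illustrative assumption, not the paper's exact formulation: it models the "flow" between two layers as a Gram-style inner-product matrix of their feature maps (averaged over spatial positions), and the transfer loss as the mean squared difference between teacher and student flow matrices over consecutive layer pairs. All function names here are hypothetical.

```python
import numpy as np

def flow_matrix(feat_a, feat_b):
    # feat_a: (h, w, c_a), feat_b: (h, w, c_b) feature maps from two layers.
    # The flow is the (c_a, c_b) inner-product matrix averaged over the
    # h*w spatial positions (a Gram-style cross-layer statistic).
    h, w, ca = feat_a.shape
    cb = feat_b.shape[2]
    fa = feat_a.reshape(h * w, ca)
    fb = feat_b.reshape(h * w, cb)
    return fa.T @ fb / (h * w)

def layer_pairs(feats):
    # Consecutive layer pairs; a "dense" variant would also include
    # overlapping non-adjacent pairs, as the abstract suggests.
    return list(zip(feats[:-1], feats[1:]))

def flow_transfer_loss(teacher_feats, student_feats):
    # Sum of mean-squared differences between teacher and student
    # flow matrices over matching layer pairs. Assumes matching
    # channel counts at corresponding layers.
    loss = 0.0
    for (ta, tb), (sa, sb) in zip(layer_pairs(teacher_feats),
                                  layer_pairs(student_feats)):
        loss += np.mean((flow_matrix(ta, tb) - flow_matrix(sa, sb)) ** 2)
    return loss
```

In a bottom-up, step-by-step scheme, such a loss would be minimized for the lowest layer pair first, then progressively for higher pairs, rather than for all pairs jointly.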
KSP Suggested Keywords
Benchmark datasets, CIFAR-10, Deep neural network(DNN), Dense flow, Fast optimization, Flow-based, Image classification, Image datasets, Knowledge transfer, Step-by-step, Transfer method