Subject: Knowledge Distillation

  • Articles (27)
  • Patents (0)
  • R&D Reports (0)
Article Search Results
Type Year Title Author Source Cited Download
Journal 2026 Adaptive Progressive Learning for Real-time Fire Source Prediction with Limited Re-training Data   Ahn Yusun  Engineering Applications of Artificial Intelligence, v.166, no.A, pp.1-25 0 Full text
Journal 2026 Def-Ag: An energy-efficient decentralized federated learning framework via aggregator clients   박준영  Future Generation Computer Systems, v.175, pp.1-17 0 Full text
Conference 2025 Analyzing the Superiority of Logit Standardization in Knowledge Distillation for Efficient Deployment in AAM Environments   Mun Gwon Han  International Conference on Information and Communication Technology Convergence (ICTC) 2025, pp.648-651
Journal 2025 LGD-BEV: Label-Guided Distillation for BEV 3D Object Detection   Lee Jiyong  IEEE Access, v.13, pp.181836-181845 0 Full text
Journal 2025 BCRNet-SNN: Body Channel Response-Aware Spiking Neural Network for User Recognition   신찬우, Kang Tae Wook  IEEE Transactions on Industrial Informatics, v.21, no.8, pp.6017-6027 1 Full text
Conference 2025 Enhancing Off-Road Semantic Segmentation with Synthetic NIR Generation and Knowledge Distillation   Kim Wonjune  International Conference on Robotics and Automation (ICRA) 2025 : Workshop, pp.1-6
Journal 2024 SITRAN: Self-Supervised IDS With Transferable Techniques for 5G Industrial Environments   Kimhyunjin  IEEE Internet of Things Journal, v.11, no.21, pp.35465-35476 6 Full text
Conference 2024 DiT-Pruner: Pruning Diffusion Transformer Models for Text-to-Image Synthesis Using Human Preference Scores   Youngwan Lee  European Conference on Computer Vision (ECCV) 2024, pp.1-9
Conference 2024 PC-LoRA: Low-Rank Adaptation for Progressive Model Compression with Knowledge Distillation   박혜원, 황인준, Youngwan Lee  Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) 2024, pp.1-8
Journal 2023 Weighted knowledge distillation of attention-LRCN for recognizing affective states from PPG signals   최지호  Expert Systems with Applications, v.233, pp.1-10 9 Full text
Journal 2023 Attention-Based Bi-Prediction Network for Versatile Video Coding (VVC) over 5G Network   최영주  Sensors, v.23, no.5, pp.1-19 5 Full text
Conference 2022 Rethinking Group Fisher Pruning for Efficient Label-Free Network Compression   Jongryul Lee  British Machine Vision Conference (BMVC) 2022, pp.1-12 1
Journal 2022 A Method of Deep Learning Model Optimization for Image Classification on Edge Device   Hyungkeuk Lee  Sensors, v.22, no.19, pp.1-15 25 Full text
Journal 2022 Semi-Supervised Domain Adaptation for Multi-Label Classification on Nonintrusive Load Monitoring   허청환  Sensors, v.22, no.15, pp.1-20 21 Full text
Journal 2021 Uncertainty-Aware Knowledge Distillation for Collision Identification of Collaborative Robots   Wookyong Kwon  Sensors, v.21, no.19, pp.1-16 9 Full text
Conference 2021 A Study on the Application of Knowledge Distillation in Ship Type Classification Model Development   Jiwon Lee  International Conference on Information and Communication Technology Convergence (ICTC) 2021, pp.280-282 0 Full text
Journal 2021 Adversarial Optimization-Based Knowledge Transfer of Layer-Wise Dense Flow for Image Classification   Yeo Doyeob  Applied Sciences, v.11, no.8, pp.1-16 2 Full text
Conference 2020 Knowledge Distillation based Compact Model Learning Method for Object Detection   Ko Jong-Gook  International Conference on Information and Communication Technology Convergence (ICTC) 2020, pp.1276-1278 4 Full text
Conference 2020 Towards Understanding Architectural Effects on Knowledge Distillation   Ik Hee Shin  International Conference on Information and Communication Technology Convergence (ICTC) 2020, pp.1144-1146 0 Full text
Journal 2020 Knowledge Transfer for On-Device Deep Reinforcement Learning in Resource Constrained Edge Computing Systems   Ingook Jang  IEEE Access, v.8, pp.146588-146597 36 Full text
Journal 2019 Layer-wise hint-based training for knowledge transfer in a teacher-student framework   Ji-Hoon Bae  ETRI Journal, v.41, no.2, pp.242-253 8 Full text
Journal 2019 Recent R&D Trends for Lightweight Deep Learning   Lee Yong-Ju  전자통신동향분석, v.34, no.2, pp.40-50 Full text
Conference 2017 A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning   임준호  Conference on Computer Vision and Pattern Recognition (CVPR) 2017, pp.7130-7138 1389 Full text
Journal 2017 Performance Analysis of Hint-KD Training Approach for the Teacher-Student Framework Using Deep Residual Networks   Ji-Hoon Bae  전자공학회논문지, v.54, no.5, pp.35-41 Full text
Patent Search Results
Status Year Patent Name Country Family Pat. KIPRIS
No search results.
R&D Report Search Results
Type Year Research Project Primary Investigator Download
No search results.