Subject: Distributed training

  • Articles (7)
  • Patents (3)
  • R&D Reports (0)
Article Search Results
Type | Year | Title | Author | Source | Cited | Download
Conference | 2024 | Efficient Data-parallel Distributed DNN Training for Big Dataset under Heterogeneous GPU Cluster | Ahn Shin Young | International Conference on Big Data (Big Data) 2024, pp.179-188 | 0 | Full Text
Conference | 2024 | Preserving Near-Optimal Gradient Sparsification Cost for Scalable Distributed Deep Learning | Yoon Daegun | International Symposium on Cluster, Cloud and Internet Computing (CCGrid) 2024, pp.307-316 | 1 | Full Text
Conference | 2024 | EDDIS: Accelerating Distributed Data-Parallel DNN Training for Heterogeneous GPU Cluster | Ahn Shin Young | International Parallel and Distributed Processing Symposium (IPDPS) 2024, pp.1167-1168 | 2 | Full Text
Conference | 2023 | Distributed DNN Training Platform for Heterogenous GPU Cluster | Ahn Shin Young | IEIE Summer Conference 2023, pp.2694-2697 | - | -
Conference | 2021 | Deep Learning Framework using Scalable Shared Memory Buffer Framework | Eun-Ji Lim | International Conference on Electronics, Information and Communication (ICEIC) 2021, pp.542-544 | 0 | Full Text
Journal | 2020 | SoftMemoryBox II: A Scalable, Shared Memory Buffer Framework for Accelerating Distributed Training of Large-Scale Deep Neural Networks | Ahn Shin Young | IEEE Access, v.8, pp.207097-207111 | 4 | Full Text
Journal | 2020 | Hardware Resource Analysis in Distributed Training with Edge Devices | Park Si Hyeong | Electronics, v.9, no.1, pp.1-13 | 5 | Full Text
Patent Search Results
Status | Year | Patent Name | Country | Family Pat. | KIPRIS
Registered | 2021 | DISTRIBUTED TRAINING METHOD BETWEEN TERMINAL AND EDGE CLOUD SERVER | UNITED STATES | - | -
Registered | 2021 | DISTRIBUTED TRAINING METHOD BETWEEN TERMINAL AND EDGE CLOUD SERVER | UNITED STATES | - | -
Registered | 2023 | METHOD AND APPARATUS FOR DISTRIBUTED TRAINING OF ARTIFICIAL INTELLIGENCE MODEL IN CHANNEL-SHARING NETWORK ENVIRONMENT | UNITED STATES | - | -
R&D Report Search Results
Type | Year | Research Project | Primary Investigator | Download
No search results.