ETRI Knowledge Sharing Platform


Status
Registered
Title
PARAMETER SERVER AND METHOD FOR SHARING DISTRIBUTED DEEP LEARNING PARAMETER USING THE SAME

Inventors
Ahn Shin Young, Eun-Ji Lim, Yongseok Choi, Young Choon Woo, Kang Dong Jae, Choi Wan
Application No.
17216322 (2021.03.29)
Publication No.
20210216495 (2021.07.15)
Registration No.
11487698 (2022.11.01)
Country
UNITED STATES
Project Code
16HS1200, Development of HPC System for Accelerating Large-scale Deep Learning, Choi Wan
Abstract
Disclosed herein are a parameter server and a method for sharing distributed deep-learning parameters using the parameter server. The method includes initializing a global weight parameter in response to an initialization request from a master process; performing an update by receiving a learned local gradient parameter from a worker process, which performs deep-learning training after updating its local weight parameter using the global weight parameter; accumulating the gradient parameters in response to a request from the master process; and performing an update by receiving, from the master process, a global weight parameter calculated using the accumulated gradient parameters of the one or more worker processes.
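The sharing cycle described in the abstract (master initializes global weights, workers pull them, train, and push local gradients, the server accumulates, and the master writes back recomputed weights) can be sketched as a minimal single-machine illustration. All class, method, and variable names below are illustrative, not from the patent, and the toy quadratic-loss gradient stands in for real deep-learning training:

```python
import numpy as np

class ParameterServer:
    """Minimal in-memory stand-in for the patent's parameter server:
    holds the global weight parameter and accumulates the local
    gradient parameters pushed by worker processes."""

    def __init__(self, dim):
        self.global_w = np.zeros(dim)   # global weight parameter
        self.grad_acc = np.zeros(dim)   # accumulated worker gradients
        self.num_pushed = 0

    def initialize(self, w0):
        # Step 1: master requests initialization of the global weights.
        self.global_w = np.array(w0, dtype=float)

    def pull_weights(self):
        # Workers read the current global weight parameter.
        return self.global_w.copy()

    def push_gradient(self, g):
        # Step 2: a worker pushes its learned local gradient parameter.
        self.grad_acc += g
        self.num_pushed += 1

    def accumulated_gradients(self):
        # Step 3: master reads the accumulated gradients on request.
        return self.grad_acc, self.num_pushed

    def update_weights(self, new_w):
        # Step 4: master writes back the recomputed global weights;
        # the accumulator is reset for the next round.
        self.global_w = np.array(new_w, dtype=float)
        self.grad_acc[:] = 0.0
        self.num_pushed = 0

def worker_step(server, local_data):
    # A worker updates its local weights from the global ones, trains
    # on its data shard, and pushes the resulting gradient.
    w = server.pull_weights()
    grad = 2 * (w - local_data.mean(axis=0))  # toy quadratic-loss gradient
    server.push_gradient(grad)

# One synchronization round with two workers.
server = ParameterServer(dim=2)
server.initialize([1.0, -1.0])
shards = [np.array([[0.0, 0.0]]), np.array([[2.0, 2.0]])]
for shard in shards:
    worker_step(server, shard)
acc, n = server.accumulated_gradients()
# Master computes the new global weights from the averaged gradients.
server.update_weights(server.pull_weights() - 0.1 * acc / n)
```

In the actual invention the server is a separate process reachable over remote shared memory rather than an in-process object, but the pull/push/accumulate/update round trip is the same.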
KSP Keywords
Learning parameters, Learning training, Local Gradient(LG), Parameter server, deep learning(DL)
Family Patent List
Status | Patent | Country
Registered | PARAMETER SERVER AND METHOD FOR SHARING DISTRIBUTED DEEP LEARNING PARAMETER USING THE SAME | KOREA
Registered | The method of deep learning parameter sharing based on remote shared memory parameter server | UNITED STATES