ETRI Knowledge Sharing Platform

Conference Paper: Quantized ADAM with Monotonically Increasing Resolution of Quantization
Cited 1 time in Scopus
Authors
석진욱, 김정시
Publication Date
October 2021
Source
International Conference on Information and Communication Technology Convergence (ICTC) 2021, pp.1719-1721
DOI
https://dx.doi.org/10.1109/ICTC52510.2021.9620825
Funded Project
21HS1900, Development of On-Device Intelligent Information Processing Acceleration SW Platform Technology for Smart Devices, 김정시
Abstract
We propose a quantized ADAM learning algorithm that achieves better optimization performance by monotonically reducing the quantization step over time when the quantization, composed of integer or fixed-point fractional values, is applied to the optimization algorithm. According to the White Noise Hypothesis, the quantization error with a dense and uniform distribution can be regarded as i.i.d. white noise. This lets us obtain a stochastic equation for the quantized ADAM learning rule, and from a stochastic analysis of the derivative of the objective function we derive the monotonically decreasing rate of the quantization step that enables global optimization. Numerical experiments show that the proposed algorithm performs better than conventional ADAM learning schemes on a ResNet for a general image classification test.
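The following is a minimal Python sketch of the idea described in the abstract: a standard ADAM update whose step is quantized to a fixed-point grid, with the quantization step shrinking monotonically over time so that the resolution of quantization increases as training proceeds. The names (quantize, quantized_adam, q0, decay) and the simple multiplicative decay schedule are illustrative assumptions; the paper derives the actual decreasing rate of the quantization step from its stochastic analysis rather than from a fixed decay factor.

import numpy as np

def quantize(x, step):
    # Round x to the nearest multiple of the quantization step (fixed-point grid).
    return step * np.round(x / step)

def quantized_adam(grad_fn, x0, steps=1000, lr=1e-2, beta1=0.9, beta2=0.999,
                   eps=1e-8, q0=1e-2, decay=0.999):
    # Hypothetical sketch: ADAM whose update is quantized, with the quantization
    # step q decreasing monotonically (i.e., the resolution increasing).
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    q = q0
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1.0 - beta1) * g
        v = beta2 * v + (1.0 - beta2) * g * g
        m_hat = m / (1.0 - beta1 ** t)
        v_hat = v / (1.0 - beta2 ** t)
        update = lr * m_hat / (np.sqrt(v_hat) + eps)
        # Under the White Noise Hypothesis the quantization error acts like
        # i.i.d. white noise whose magnitude shrinks along with q.
        x = x - quantize(update, q)
        q = max(q * decay, 1e-8)  # monotonically decreasing quantization step
    return x

# Toy usage: minimize the quadratic f(x) = (x - 3)^2.
if __name__ == "__main__":
    grad = lambda x: 2.0 * (x - 3.0)
    print(quantized_adam(grad, x0=np.array([10.0]), steps=5000))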
KSP Suggested Keywords
Fixed-point, Global optimization, Image classification, Numerical experiments, Optimization algorithm, Optimization performance, Quantization error, Uniform Distribution, White noise, decreasing rate, learning algorithms