ETRI Knowledge Sharing Platform

Quantized ADAM with Monotonically Increasing Resolution of Quantization
Cited 3 times in Scopus
Authors
Jinwuk Seok, Jeong-Si Kim
Issue Date
2021-10
Citation
International Conference on Information and Communication Technology Convergence (ICTC) 2021, pp.1719-1721
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICTC52510.2021.9620825
Abstract
We propose a quantized ADAM learning algorithm that achieves better optimization performance by monotonically reducing the quantization step over time, where the quantization maps values to integer or fixed-point fractional representations within the optimization algorithm. Under the White Noise Hypothesis, which applies to quantization error with a dense and uniform distribution, the quantization error can be regarded as i.i.d. white noise. This allows us to derive a stochastic equation for the quantized ADAM learning rule, and through stochastic analysis of the derivative of the objective function we obtain a monotonically decreasing rate for the quantization step that enables global optimization. Numerical experiments show that the proposed algorithm outperforms conventional ADAM learning schemes on a ResNet for a general image classification test.
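For illustration, the sketch below shows one possible form of such an update in NumPy: a standard ADAM step whose result is rounded onto a fixed-point grid whose step delta_t shrinks monotonically with the iteration count t, so the resolution of the quantization increases over training. The function names (quantize, quantized_adam_step) and the power-law schedule delta_t = delta0 / t**gamma are assumptions made for this sketch; the paper derives the actual decreasing rate of the quantization step from its stochastic analysis.

```python
import numpy as np

def quantize(x, delta):
    # Uniform quantizer: round each entry of x to the nearest
    # multiple of the quantization step delta.
    return delta * np.round(x / delta)

def quantized_adam_step(theta, grad, m, v, t, lr=1e-3,
                        beta1=0.9, beta2=0.999, eps=1e-8,
                        delta0=1e-2, gamma=0.5):
    # Standard ADAM moment updates with bias correction.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    # Hypothetical power-law decay: the quantization step shrinks
    # monotonically in t; the paper derives its own schedule,
    # so this one is purely illustrative.
    delta_t = delta0 / t ** gamma
    return quantize(theta, delta_t), m, v

# Usage sketch: minimize f(x) = (x - 3)^2 with the quantized update.
theta, m, v = np.zeros(1), np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    grad = 2.0 * (theta - 3.0)
    theta, m, v = quantized_adam_step(theta, grad, m, v, t, lr=0.05)
print(theta)  # settles near 3.0 as the quantization grid refines
```

Quantizing after the update keeps the stored parameters on a fixed-point grid at every iteration, while the shrinking step allows late-stage updates to refine the solution rather than stalling on a coarse grid.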
KSP Keywords
Image Classification, Numerical experiments, Objective function, Optimization algorithm, Optimization performance, Quantization Error, Uniform distribution, White Noise, decreasing rate, fixed point, global optimization