ETRI-Knowledge Sharing Platform


Adversarial Training With Stochastic Weight Average
Cited 8 times in Scopus
Authors
Joong-won Hwang, Youngwan Lee, Sungchan Oh, Yuseok Bae
Issue Date
2021-09
Citation
International Conference on Image Processing (ICIP) 2021, pp.814-818
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICIP42928.2021.9506548
Abstract
Although adversarial training is the most reliable method to train robust deep neural networks so far, adversarially trained networks still show a large gap between their accuracy on clean images and their accuracy on adversarial images. In conventional classification problems, one can gain higher accuracy by ensembling multiple networks. In adversarial training, however, there are obstacles to adopting such an ensemble method. First, since the inner maximization is expensive, adversarially training multiple networks becomes a heavy burden. Moreover, a naive ensemble faces a dilemma in choosing the target model with which to generate adversarial examples: training on adversarial examples of the individual members causes covariate shift, while training on those of the ensemble diminishes the benefit of ensembling. With these insights, we adopt stochastic weight averaging and improve it by considering the overfitting nature of adversarial training. Our method takes the benefit of ensembling while avoiding the problems described above. Experiments on CIFAR10 and CIFAR100 show that our method improves robustness effectively.
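The abstract builds on stochastic weight averaging (SWA), which approximates an ensemble by averaging the weights of checkpoints collected along a single training run, rather than keeping multiple networks. A minimal sketch of the core running-average update is below; the function and variable names are illustrative, not taken from the paper, and real usage would average the tensors of a neural network rather than plain lists of floats.

```python
# Minimal sketch of the stochastic weight averaging (SWA) update: keep a
# running average of model weights across checkpoints. Names are illustrative.

def swa_update(swa_weights, new_weights, n_averaged):
    """Fold one more checkpoint into the running average of weights.

    swa_weights: current running average (list of floats, one per parameter)
    new_weights: weights from the latest checkpoint
    n_averaged:  how many checkpoints are already in the average
    """
    return [
        (w_avg * n_averaged + w_new) / (n_averaged + 1)
        for w_avg, w_new in zip(swa_weights, new_weights)
    ]

# Averaging three checkpoints of a toy two-parameter model:
swa = [0.0, 0.0]
for i, ckpt in enumerate([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]):
    swa = swa_update(swa, ckpt, i)
print(swa)  # equals the mean of all checkpoints: [3.0, 4.0]
```

In the adversarial-training setting described above, this averaging yields ensemble-like benefits at the cost of a single training run, so the expensive inner maximization is performed only once per step rather than once per ensemble member.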
KSP Keywords
Adversarial Training, Classification problems, Covariate shift, Deep neural network(DNN), Ensemble method, Multiple network, large gap, neural network(NN), target model