ETRI Knowledge Sharing Platform




Detailed Information

Conference Paper: Adversarial Training With Stochastic Weight Average
Cited 3 times in Scopus; downloaded 5 times
Authors
황중원, 이영완, 오성찬, 배유석
Publication Date
September 2021
Source
International Conference on Image Processing (ICIP) 2021, pp.814-818
DOI
https://dx.doi.org/10.1109/ICIP42928.2021.9506548
Research Project
20HS5100, (DeepView, Sub-project 1) Development of a High-Performance Visual Discovery Platform for Understanding and Predicting Large-Scale Video Data in Real Time, 배유석
Abstract
Although adversarial training is the most reliable method known so far for training robust deep neural networks, adversarially trained networks still show a large gap between their accuracy on clean images and their accuracy on adversarial images. In conventional classification problems, one can gain higher accuracy by ensembling multiple networks. In adversarial training, however, there are obstacles to adopting such ensemble methods. First, since the inner maximization is expensive, training multiple networks adversarially becomes a heavy burden. Moreover, a naive ensemble faces a dilemma in choosing the target model with which to generate adversarial examples: training on adversarial examples crafted against the individual members causes covariate shift, while training on those crafted against the ensemble diminishes the benefit of ensembling. With these insights, we adopt stochastic weight averaging and improve it by accounting for the overfitting nature of adversarial training. Our method retains the benefit of ensembling while avoiding the problems described above. Experiments on CIFAR10 and CIFAR100 show that our method improves robustness effectively.
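The combination described in the abstract can be sketched in a few lines: run adversarial training (inner maximization to craft a perturbed input, then a gradient step on that input) and, late in training, maintain a running average of the weights as in stochastic weight averaging (SWA). The toy example below uses a logistic-regression model with a one-step FGSM-style inner maximization; all function names (`fgsm_perturb`, `train_swa`) and hyperparameters are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(w, x, y, eps=0.1):
    # Inner maximization, approximated by a single FGSM step:
    # move the input in the direction that increases the loss.
    grad_x = (sigmoid(w @ x) - y) * w  # d(logistic loss)/dx for label y in {0,1}
    return x + eps * np.sign(grad_x)

def train_swa(X, Y, epochs=50, lr=0.1, swa_start=25):
    # Adversarial training with a running weight average started
    # partway through training (hypothetical schedule).
    w = np.zeros(X.shape[1])
    w_swa, n_avg = np.zeros_like(w), 0
    for epoch in range(epochs):
        for x, y in zip(X, Y):
            x_adv = fgsm_perturb(w, x, y)          # craft adversarial example
            grad_w = (sigmoid(w @ x_adv) - y) * x_adv  # loss gradient w.r.t. weights
            w -= lr * grad_w                       # outer minimization step
        if epoch >= swa_start:
            # SWA: incremental mean of the weight iterates
            w_swa = (w_swa * n_avg + w) / (n_avg + 1)
            n_avg += 1
    return w_swa

# Toy, well-separated two-class data.
X = np.vstack([rng.normal(size=(200, 2)) + 2.0,
               rng.normal(size=(200, 2)) - 2.0])
Y = np.array([1] * 200 + [0] * 200)
w_avg = train_swa(X, Y)
clean_acc = np.mean((sigmoid(X @ w_avg) > 0.5) == Y)
```

The averaged weights `w_avg` play the role of a cheap ensemble: one set of adversarial examples is generated per step against the current model, avoiding both the cost of adversarially training several networks and the target-model dilemma of a naive ensemble.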
KSP Suggested Keywords
Adversarial Training, Classification problems, Covariate shift, Deep neural network(DNN), Ensemble method, Large gap, Multiple network, target model