Few-Shot Anomaly Detection with Adversarial Loss for Robust Feature Representations
Authors
Jae Young Lee, Wonjun Lee, Jaehyun Choi, Yongkwi Lee, Young Seog Yoon
Issue Date
2023-11
Citation
British Machine Vision Conference (BMVC) 2023, pp.1-13
Publisher
BMVA 
Language
English
Type
Conference Paper
Abstract
Anomaly detection is a critical and challenging task that aims to identify data points deviating from the normal patterns and distributions within a dataset. Various methods have been proposed using a one-class-one-model approach, but these techniques often face practical problems such as memory inefficiency and the requirement of sufficient data for training. In particular, few-shot anomaly detection presents significant challenges in industrial applications, where only a limited number of samples are available before mass production. In this paper, we propose a few-shot anomaly detection method that integrates an adversarial training loss to obtain more robust and generalized feature representations. We utilize the adversarial loss previously employed in domain adaptation to align feature distributions between source and target domains, in order to enhance feature robustness and generalization in few-shot anomaly detection tasks. We hypothesize that the adversarial loss is effective when applied to features that should have similar characteristics, such as those from the same layer in a Siamese network's parallel branches or the input-output pairs of reconstruction-based methods. Experimental results demonstrate that the proposed method generally achieves better performance when utilizing the adversarial loss.
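To illustrate the idea described in the abstract, the following is a minimal, hypothetical sketch (not the authors' released code) of a domain-adaptation-style adversarial loss that encourages two sets of features that should be similar, such as features from a Siamese network's parallel branches or the input-output features of a reconstruction model, to share a distribution. All module names, dimensions, and the two-term loss formulation below are illustrative assumptions.

```python
import torch
import torch.nn as nn


class FeatureDiscriminator(nn.Module):
    """Predicts which branch a feature vector came from (branch A vs. branch B)."""

    def __init__(self, feat_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def adversarial_alignment_loss(disc: FeatureDiscriminator,
                               feats_a: torch.Tensor,
                               feats_b: torch.Tensor):
    """Returns (discriminator loss, feature/generator loss).

    The discriminator is trained to tell the two feature sets apart; the feature
    extractor is trained via the generator loss so that the discriminator cannot,
    which aligns the two feature distributions.
    """
    bce = nn.BCEWithLogitsLoss()

    # Discriminator step: label branch A as 1, branch B as 0 (features detached
    # so this loss only updates the discriminator).
    logits_a = disc(feats_a.detach())
    logits_b = disc(feats_b.detach())
    d_loss = bce(logits_a, torch.ones_like(logits_a)) + \
             bce(logits_b, torch.zeros_like(logits_b))

    # Feature (generator) step: update the feature extractor so branch B
    # features are indistinguishable from branch A features.
    g_loss = bce(disc(feats_b), torch.ones_like(logits_b))
    return d_loss, g_loss


if __name__ == "__main__":
    # Toy usage with random tensors standing in for two Siamese branches.
    disc = FeatureDiscriminator(feat_dim=128)
    fa = torch.randn(8, 128)
    fb = torch.randn(8, 128, requires_grad=True)
    d_loss, g_loss = adversarial_alignment_loss(disc, fa, fb)
    print(d_loss.item(), g_loss.item())
```

In an actual training loop, the discriminator loss and the feature loss would be optimized alternately (or with a gradient-reversal layer), and this adversarial term would be added to the base few-shot anomaly detection objective; the paper should be consulted for the exact formulation and weighting.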
KSP Keywords
Adversarial Training, Detection Method, Feature representation, Robust feature, Siamese network, anomaly detection, domain adaptation, feature distribution, industrial applications, input and output, mass production