ETRI Knowledge Sharing Platform

Local Attention Neural Network for Median Filtering: Enhancing Nonlinear Local Operations
Cited 0 times in Scopus
Authors
Soonchul Jung, Jae Woo Kim, Yoon-Seok Choi, Jin-Seo Kim
Issue Date
2024-10
Citation
International Conference on Information and Communication Technology Convergence (ICTC) 2024, pp.1-4
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICTC62082.2024.10827697
Abstract
Recent advancements in large-scale neural networks have achieved performance levels nearly indistinguishable from human capabilities across various domains, such as image recognition, natural language processing, and game playing. Despite these impressive capabilities, it is known that even the most powerful neural networks cannot directly execute basic arithmetic operations without a specific architectural design. The median filter is a nonlinear, local processing technique widely used in image processing for noise reduction. Unlike simple arithmetic operations, the median filter involves sorting pixel values within a neighborhood, which requires logical operations in addition to numerical calculations. Surprisingly, research on implementing such nonlinear local operations within neural networks is relatively sparse. Self-attention, a core component of modern transformer models, enables a network to focus on specific parts of its input. In this paper, we propose a novel neural network architecture that implements the median filtering operation by applying self-attention mechanisms locally. By leveraging self-attention, our approach effectively mimics the sorting and selection process inherent to the median filter. Our experimental results demonstrate that this architecture surpasses the performance of previous convolutional neural networks specifically designed to implement the median filter, achieving 100% accuracy on a subset of random image data. However, the model did not consistently achieve perfect accuracy across all input types. This suggests that while our approach significantly improves performance, it does not yet flawlessly replicate the median filter's functionality.
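The paper's exact architecture is not reproduced on this page, but the core idea — that an attention-style weighting over a pixel's neighborhood can mimic the median's sort-and-select step — can be illustrated with a minimal NumPy sketch. The function names (`median_filter_3x3`, `soft_median_3x3`) and the scoring rule are illustrative assumptions, not the authors' implementation: each neighbor is scored by its negative total absolute deviation from the other neighbors (the value minimizing this is the median), and a softmax over those scores yields attention weights whose weighted sum approaches the true median as the temperature shrinks.

```python
import numpy as np

def median_filter_3x3(img):
    """Reference 3x3 median filter; border pixels are left unchanged."""
    out = img.astype(float).copy()
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = np.median(img[i - 1:i + 2, j - 1:j + 2])
    return out

def soft_median_3x3(img, temperature=0.1):
    """Differentiable, attention-style approximation of the 3x3 median.

    Illustrative sketch only (not the paper's architecture): the median of a
    set minimizes the total absolute deviation to its members, so each
    neighbor is scored by -sum(|n_i - n_j|) and the scores are turned into
    softmax attention weights over the neighborhood.
    """
    out = img.astype(float).copy()
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            n = img[i - 1:i + 2, j - 1:j + 2].astype(float).ravel()
            # Score each neighbor by how small its L1 distance to the
            # other neighbors is; the median gets the highest score.
            scores = -np.abs(n[:, None] - n[None, :]).sum(axis=1)
            # Numerically stable softmax over the 9 neighborhood values.
            scores = (scores - scores.max()) / temperature
            weights = np.exp(scores)
            weights /= weights.sum()
            out[i, j] = (weights * n).sum()
    return out
```

At low temperature the softmax concentrates on the single neighbor with minimal total deviation, so the weighted sum converges to the exact median; at higher temperatures the output blends several neighbors, which is one plausible reason a learned attention network might fall short of 100% accuracy on some inputs, as the abstract reports.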
KSP Keywords
Architectural Design, Arithmetic operations, Attention mechanism, Convolution neural network(CNN), Human capabilities, Image data, Image processing(IP), Image recognition, Large-scale Neural Networks, Median Filtering, Natural Language Processing(NLP)