ETRI Knowledge Sharing Platform

Unsupervised Bidirectional Style Transfer Network using Local Feature Transform Module

Cited 4 times in Scopus · Downloaded 109 times
Authors: Kangmin Bae, Hyung-Il Kim, Yongjin Kwon, Jinyoung Moon
Issue Date: 2023-06
Citation: Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) 2023, pp. 740-749
Language: English
Type: Conference Paper
DOI: https://dx.doi.org/10.1109/CVPRW59228.2023.00081
Abstract
In this paper, we propose a bidirectional style transfer method that exchanges the styles of two inputs while preserving their structural information. The proposed bidirectional style transfer network consists of three modules: 1) a content and style extraction module that extracts structure- and style-related features, 2) a local feature transform module that aligns locally extracted features to their original coordinates, and 3) a reconstruction module that generates a newly stylized image. Given two input images, we extract content and style information from both images in a global and a local manner, respectively. Note that the content extraction module removes style-related information by compressing the feature tensor to a single channel, while the style extraction module removes content information by gradually reducing the spatial size of the feature tensor. The local feature transform module exchanges the style information and spatially transforms the local features back to their original locations. By substituting the style information in both ways (i.e., globally and locally) and in both directions, the reconstruction module generates a newly stylized image without diminishing the core structure. Furthermore, the proposed network can control the degree of style applied when exchanging the styles of the inputs bidirectionally. Through experiments, we compare the bidirectional style transfer results with existing methods quantitatively and qualitatively, and we show generation results obtained by controlling the degree of applied style and by applying various textures to an identical structure.
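The three-module pipeline described in the abstract can be sketched at a very high level with plain array operations. This is a minimal illustrative sketch, not the authors' implementation: here content extraction is stood in for by channel averaging (compression to a single channel), style extraction by global average pooling (a stand-in for gradual spatial reduction), and reconstruction by a weighted recombination with an `alpha` parameter mirroring the controllable degree of style; all function names and tensor shapes are assumptions.

```python
import numpy as np

def extract_content(feat):
    # Content extraction (sketch): compress the channel dimension to a
    # single channel, discarding style-related information.
    return feat.mean(axis=0, keepdims=True)          # (1, H, W)

def extract_style(feat):
    # Style extraction (sketch): global average pooling stands in for the
    # paper's gradual reduction of spatial size, removing content.
    return feat.mean(axis=(1, 2), keepdims=True)     # (C, 1, 1)

def reconstruct(content, style, alpha=1.0):
    # Reconstruction (sketch): recombine structure with the exchanged
    # style; alpha controls the degree of style applied.
    return content + alpha * style                   # broadcasts to (C, H, W)

def bidirectional_transfer(feat_a, feat_b, alpha=1.0):
    # Exchange styles in both directions while keeping each input's
    # structural (content) information.
    ca, cb = extract_content(feat_a), extract_content(feat_b)
    sa, sb = extract_style(feat_a), extract_style(feat_b)
    return reconstruct(ca, sb, alpha), reconstruct(cb, sa, alpha)

rng = np.random.default_rng(0)
fa = rng.normal(size=(8, 4, 4))   # hypothetical feature maps, C=8, H=W=4
fb = rng.normal(size=(8, 4, 4))
out_ab, out_ba = bidirectional_transfer(fa, fb)
print(out_ab.shape, out_ba.shape)  # (8, 4, 4) (8, 4, 4)
```

In the actual network these stand-ins are learned convolutional modules, and the local feature transform additionally maps locally extracted features back to their original spatial coordinates before reconstruction, which the pooling shortcut above does not model.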
KSP Keywords
Content Extraction, Core structure, Feature transform, Global and local, Local feature, Single Channel, Structural information, Style transfer, Transfer method, Transfer network