ETRI Knowledge Sharing Platform

Full-parallax Virtual View Image Synthesis using Image-based Rendering for Light-field Content Generation
Cited 0 times in Scopus · Downloaded 1 time
Authors
Youngsoo Park, Hong-chang Shin, Gwangsoon Lee, Won-sik Cheong, Namho Hur
Issue Date
2017-05
Citation
SPIE Commercial + Scientific Sensing and Imaging 2017 (SPIE 10219), v.10219, pp.1-12
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1117/12.2264596
Project Code
16MB2200, Full 3D mobile device & contents, Hwang Chi-Sun
Abstract
Light-field content is required to provide full-parallax 3D view with dense angular resolution. However, it is very hard to directly capture such dense full-parallax view images using a camera system because it requires specialised micro-lens arrays or a heavy camera-array system. Therefore, we present an algorithm to synthesise full-parallax virtual view images using image-based rendering appropriate for light-field content generation. The proposed algorithm consists of four-directional image warping, view image blending using the nearest view image priority selection and the sum of the weighted inverse Euclidean distance, and hole filling. Experimental results show that dense full-parallax virtual view images can be generated from sparse full-parallax view images with fewer image artefacts. Finally, it is confirmed that the proposed full-parallax view synthesis algorithm can be used for light-field content generation without a dense camera array system.
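The blending step named in the abstract combines warped reference views using the sum of weighted inverse Euclidean distances. The paper's own implementation is not published here, so the following is a minimal NumPy sketch of that weighting scheme under stated assumptions: each warped view is an array with NaN marking hole pixels (pixels no reference view reached), `distances` holds each reference camera's Euclidean distance to the virtual viewpoint, and the nearest-view priority selection described in the paper is simplified away in favor of pure inverse-distance weighting.

```python
import numpy as np

def blend_views(warped_views, distances):
    """Blend warped reference views into one virtual view using
    inverse-Euclidean-distance weights (hypothetical sketch, not the
    authors' implementation). NaN pixels are treated as holes."""
    views = np.stack(warped_views).astype(float)      # (N, H, W)
    w = 1.0 / np.asarray(distances, dtype=float)      # per-view weight 1/d_i
    valid = ~np.isnan(views)                          # per-pixel validity mask
    w_map = valid * w[:, None, None]                  # zero weight at holes
    num = np.nansum(views * w_map, axis=0)            # sum of w_i * I_i
    den = w_map.sum(axis=0)                           # sum of w_i over valid views
    # Pixels with no valid contribution stay NaN: these are the holes
    # that the subsequent hole-filling stage would repair.
    return np.where(den > 0, num / np.maximum(den, 1e-12), np.nan)
```

A pixel covered by several warped views is thus dominated by the view whose camera is nearest to the virtual viewpoint, while a pixel reached by only one view simply copies that view's value; remaining NaN pixels are handed to hole filling.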
KSP Keywords
Camera system, Directional image, Hole filling, Image Based Rendering, Micro-lens, angular resolution, camera array, content generation, euclidean distance, image blending, image synthesis