ETRI Knowledge Sharing Platform

Conference Paper: Full-parallax Virtual View Image Synthesis using Image-based Rendering for Light-field Content Generation
Authors
박영수, 신홍창, 이광순, 정원식, 허남호
Publication Date
May 2017
Source
SPIE Commercial + Scientific Sensing and Imaging 2017 (SPIE 10219), v.10219, pp.1-12
DOI
https://dx.doi.org/10.1117/12.2264596
Funded Project
16MB2200, Development of Mobile Full-3D Terminal and Content Technology, PI: 황치선
Abstract
Light-field content is required to provide full-parallax 3D view with dense angular resolution. However, it is very hard to directly capture such dense full-parallax view images using a camera system because it requires specialised micro-lens arrays or a heavy camera-array system. Therefore, we present an algorithm to synthesise full-parallax virtual view images using image-based rendering appropriate for light-field content generation. The proposed algorithm consists of four-directional image warping, view image blending using the nearest view image priority selection and the sum of the weighted inverse Euclidean distance, and hole filling. Experimental results show that dense full-parallax virtual view images can be generated from sparse full-parallax view images with fewer image artefacts. Finally, it is confirmed that the proposed full-parallax view synthesis algorithm can be used for light-field content generation without a dense camera array system.
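As an illustration of the blending and hole-filling steps described in the abstract, the following Python sketch blends reference views that are assumed to have already been warped to the virtual view position, weighting each by the inverse Euclidean distance between its grid position and the virtual view, and then fills any remaining holes from the nearest valid pixel in each row. This is a minimal sketch under those assumptions, not the authors' implementation: the function names (blend_views, fill_holes), the NaN-based hole marking, and the simplified row-wise hole filling are illustrative, and the paper's nearest-view priority selection step is not reproduced.

# Minimal sketch (not the authors' code) of inverse-distance view blending
# and a simplified hole fill, assuming the reference views have already
# been warped to the virtual view position.
import numpy as np

def blend_views(warped_views, positions, target_pos, eps=1e-6):
    """Blend warped reference views with inverse-Euclidean-distance weights.

    warped_views: list of (H, W) arrays with np.nan marking disoccluded holes
    positions:    list of (x, y) grid coordinates of the reference views
    target_pos:   (x, y) grid coordinate of the virtual view
    """
    h, w = warped_views[0].shape
    acc = np.zeros((h, w))
    wsum = np.zeros((h, w))
    for view, pos in zip(warped_views, positions):
        dist = np.linalg.norm(np.subtract(pos, target_pos)) + eps
        valid = ~np.isnan(view)          # pixels this reference actually covers
        acc[valid] += view[valid] / dist  # weight = 1 / Euclidean distance
        wsum[valid] += 1.0 / dist
    out = np.full((h, w), np.nan)
    covered = wsum > 0
    out[covered] = acc[covered] / wsum[covered]
    return out

def fill_holes(image):
    """Fill remaining NaN holes with the nearest valid pixel in the same row."""
    out = image.copy()
    for row in out:
        valid = np.where(~np.isnan(row))[0]
        if valid.size == 0:
            continue
        holes = np.where(np.isnan(row))[0]
        nearest = valid[np.abs(holes[:, None] - valid[None, :]).argmin(axis=1)]
        row[holes] = row[nearest]
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    views = [rng.random((4, 4)) for _ in range(4)]
    for v in views:                           # punch a few synthetic holes
        v[rng.integers(4), rng.integers(4)] = np.nan
    grid = [(0, 0), (1, 0), (0, 1), (1, 1)]   # four nearest reference views
    virtual = fill_holes(blend_views(views, grid, target_pos=(0.3, 0.6)))
    print(virtual)

In this sketch the closer a reference view lies to the virtual view position on the grid, the larger its blending weight, which mirrors the weighted-inverse-distance blending the abstract describes; a full implementation would also perform the four-directional warping and a more careful hole-filling pass.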
KSP Suggested Keywords
Camera system, Directional image, Hole filling, Image-based rendering, Micro-lens, Angular resolution, Camera array, Content generation, Euclidean distance, Image blending, Image synthesis