ETRI Knowledge Sharing Platform



Details

Journal Article: View synthesis with sparse light field for 6DoF immersive video
Cited 6 times in Scopus
Authors
곽상운, 윤정일, 정준영, 김영욱, 임인성, 정원식, 서정일
Publication Date
2022-02
Source
ETRI Journal, v.44 no.1, pp.24-37
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
DOI
https://dx.doi.org/10.4218/etrij.2021-0205
Project
21HH7300, [Integrated Project] Development of AV coding and LF media core technologies for hyper-realistic tera-media, 최진수
Abstract
Virtual view synthesis, which generates novel views whose characteristics resemble those of actually acquired images, is an essential technical component for delivering immersive video with realistic binocular disparity and smooth motion parallax. It is typically performed in three stages: warping the given images to the designated viewing position, blending the warped images, and filling the remaining holes. For 6DoF use cases with large viewpoint changes, patch-based warping is preferable to conventional pixel-based methods. In that case, the quality of the synthesized image depends strongly on how the warped images are blended. Based on this observation, we propose a novel blending architecture that exploits the similarity of ray directions and the distribution of depth values. Experimental results show that the proposed method synthesizes views of higher quality than the well-designed synthesizers used within the Moving Picture Experts Group immersive video (MPEG-I) activity. Moreover, we describe a GPU-based implementation that synthesizes and renders views in real time, demonstrating the method's applicability to immersive video services.
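The blending stage described above can be illustrated with a minimal NumPy sketch. This is not the paper's actual algorithm or its GPU implementation; it only demonstrates the general idea of weighting each warped view by ray-direction similarity and by depth reliability (favoring samples near the front-most depth at each pixel). The function name and the falloff parameters `alpha` and `beta` are illustrative assumptions.

```python
import numpy as np

def blend_warped_views(colors, depths, ray_angles, alpha=5.0, beta=2.0):
    """Blend N warped source views into one target image.

    colors:     (N, H, W, 3) warped source images
    depths:     (N, H, W)    per-pixel depth after warping to the target view
    ray_angles: (N, H, W)    angle (radians) between each source ray and
                             the target ray for that pixel

    alpha and beta are illustrative falloff parameters, not values
    taken from the paper.
    """
    # Favor samples whose source ray direction is close to the target ray.
    w_ray = np.exp(-alpha * ray_angles)

    # Favor samples near the closest (front-most) depth at each pixel,
    # so that occluded background pixels receive little weight.
    d_min = depths.min(axis=0, keepdims=True)
    w_depth = np.exp(-beta * (depths - d_min))

    # Normalize the combined weights per pixel and blend.
    w = w_ray * w_depth
    w_sum = w.sum(axis=0, keepdims=True)
    w = w / np.maximum(w_sum, 1e-8)
    return (w[..., None] * colors).sum(axis=0)
```

Pixels left uncovered after this weighted sum would still contain holes, which the pipeline's final inpainting stage is responsible for filling.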
KSP Suggested Keywords
Conventional methods, GPU-based, Immersive video, MPEG-I, Motion parallax, Moving picture, Pixel unit, Real-Time, Smooth Motion, Use Cases, binocular disparity
This work may be used under the Korea Open Government License (KOGL) Type 4: source attribution + no commercial use + no modification.