ETRI-Knowledge Sharing Platform



Details

Journal Article: Media Orchestration Between Streams and Devices via New MPEG Timed Metadata
Authors
M. Oskar van Deventer, Jean-Claude Dufourd, 오세진, 임성용, 임영권, Krishna Chandramouli, Rob Koenen
Publication Date
November 2018
Source
SMPTE Motion Imaging Journal, v.127 no.10, pp.32-38
ISSN
1545-0279
Publisher
SMPTE
DOI
https://dx.doi.org/10.5594/JMI.2018.2870019
Project
18ZR1100, Development of Fundamental Technologies for Hyper-Realistic Spatial Media, 서정일
Abstract
The proliferation of affordable smart devices capable of capturing, processing, and rendering audiovisual media content triggers a need to coordinate and orchestrate these devices, their capabilities, and the content flowing to and from them. The upcoming Moving Picture Experts Group (MPEG) Media Orchestration standard (MORE, ISO/IEC 23001-13) enables the temporal and spatial orchestration of multiple media and metadata streams. Temporal orchestration concerns the time synchronization of media and sensor captures, processing, and renderings, for which the MORE standard uses and extends a Digital Video Broadcasting standard. Spatial orchestration concerns the alignment of (global) position, altitude, and orientation, for which the MORE standard provides dedicated timed metadata. Other types of orchestration involve timed metadata for the region of interest, perceptual quality of media, audio-feature extraction, and media timeline correlation. This paper presents the status of the MORE standard as well as the associated technical and experimental support materials. We also link MORE to the recently initiated MPEG immersive project.
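The two orchestration modes in the abstract can be sketched, very loosely, as timed metadata samples plus a timeline-mapping step. The sketch below is illustrative only: the class name, field names, and synchronization offsets are assumptions for exposition, not the actual ISO/IEC 23001-13 syntax or the DVB synchronization protocol the standard builds on.

```python
from dataclasses import dataclass

@dataclass
class SpatialSample:
    """Hypothetical timed-metadata sample (not actual MORE syntax):
    position, altitude, and orientation stamped on a media timeline."""
    media_time_us: int   # device-local media timestamp, microseconds
    latitude_deg: float
    longitude_deg: float
    altitude_m: float
    yaw_deg: float       # orientation around the vertical axis, degrees

def to_common_timeline(media_time_us: int, offset_us: int,
                       drift_ppm: float = 0.0) -> int:
    """Map a device-local media timestamp onto a shared reference timeline.

    Temporal orchestration reduces, at its core, to applying a per-stream
    offset (and optionally a clock-drift correction) so that samples from
    independently clocked devices line up on one timeline.
    """
    corrected = media_time_us * (1.0 + drift_ppm * 1e-6)
    return round(corrected) + offset_us

# Two devices captured the same instant with different local clocks.
sample_a = SpatialSample(1_000_000, 52.0, 4.3, 10.0, 90.0)
sample_b = SpatialSample(2_500_000, 52.0, 4.3, 10.0, 91.0)

# Per-stream offsets would come from a synchronization protocol;
# here they are simply assumed known.
t_a = to_common_timeline(sample_a.media_time_us, offset_us=500_000)
t_b = to_common_timeline(sample_b.media_time_us, offset_us=-1_000_000)
print(t_a == t_b)  # both map to 1_500_000 on the shared timeline
```

Spatial orchestration would then operate on the aligned samples, e.g. comparing the `yaw_deg` fields of co-timed captures to stitch or select views.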
KSP Suggested Keywords
Audiovisual media, Digital video broadcasting, Experimental support, Feature extraction, Media orchestration, Moving Picture Experts Group (MPEG), Perceptual quality, Region of interest (ROI), Smart devices, Time synchronization, Media content