ETRI Knowledge Sharing Platform

Media Orchestration Between Streams and Devices via New MPEG Timed Metadata
Cited 2 times in Scopus
Authors
M. Oskar van Deventer, Jean-Claude Dufourd, Sejin Oh, Seong Yong Lim, Youngkwon Lim, Krishna Chandramouli, Rob Koenen
Issue Date
2018-11
Citation
SMPTE Motion Imaging Journal, v.127, no.10, pp.32-38
ISSN
1545-0279
Publisher
SMPTE
Language
English
Type
Journal Article
DOI
https://dx.doi.org/10.5594/JMI.2018.2870019
Abstract
The proliferation of affordable smart devices capable of capturing, processing, and rendering audiovisual media content triggers a need for coordination and orchestration between these devices, their capabilities, and the content flowing to and from them. The upcoming Moving Picture Experts Group (MPEG) Media Orchestration standard (MORE, ISO/IEC 23001-13) enables the temporal and spatial orchestration of multiple media and metadata streams. Temporal orchestration concerns time synchronization of media and sensor captures, processing, and renderings, for which the MORE standard uses and extends a Digital Video Broadcasting standard. Spatial orchestration concerns the alignment of (global) position, altitude, and orientation, for which the MORE standard provides dedicated timed metadata. Other types of orchestration involve timed metadata for the region of interest, perceptual quality of media, audio-feature extraction, and media timeline correlation. This paper presents the status of the MORE standard as well as the associated technical and experimental support materials. We also link MORE to the recently initiated MPEG immersive project.
KSP Keywords
Audiovisual media, Digital video broadcasting, Experimental support, Feature extraction, Media orchestration, Moving Picture Experts Group (MPEG), Perceptual quality, Region of interest (ROI), Smart devices, Time synchronization, Media content