ETRI Knowledge Sharing Platform





Conference Paper: 4D Effect Video Classification with Shot-aware Frame Selection and Deep Neural Networks
Cited 6 times in Scopus
Siadari, Hyunjin Yoon, Mikyong Han
International Conference on Computer Vision Workshops (ICCVW) 2017, pp.1148-1155
17ZF1200, Development of Core XD Media Technologies to Revitalize the Immersive Content Industry, Mikyong Han
A 4D effect video, played at cinemas or other designated venues, is a video annotated with physical effects such as motion, vibration, wind, flashlight, water spray, and scent. To automate the time-consuming and labor-intensive process of creating such videos, we propose a new method for classifying videos into 4D effect types using shot-aware frame selection and deep neural networks (DNNs). Shot-aware frame selection selects video frames across multiple shots in proportion to the shot length ratios, subsampling every video down to a fixed number of frames for classification. For empirical evaluation, we collect a new dataset of 4D effect videos, most of which consist of multiple shots. Our extensive experiments show that the proposed method consistently outperforms DNNs that do not consider the multi-shot aspect, by up to 8.8% in terms of mean average precision.
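The shot-aware frame selection described above can be sketched as follows: distribute a fixed frame budget across shots in proportion to each shot's length, then pick frames uniformly within each shot. This is a minimal illustration of that idea only; the function name, the largest-remainder rounding, and the uniform within-shot spacing are assumptions, not details from the paper.

```python
def shot_aware_frame_selection(shot_lengths, num_frames):
    """Return `num_frames` global frame indices, spread across shots
    in proportion to each shot's length (given in frames).

    Note: the proportional-allocation and sampling details here are
    illustrative assumptions, not the paper's exact procedure.
    """
    total = sum(shot_lengths)
    # Proportional allocation with largest-remainder rounding so the
    # per-shot counts sum exactly to num_frames.
    quotas = [n * num_frames / total for n in shot_lengths]
    counts = [int(q) for q in quotas]
    remainder = num_frames - sum(counts)
    order = sorted(range(len(shot_lengths)),
                   key=lambda i: quotas[i] - counts[i], reverse=True)
    for i in order[:remainder]:
        counts[i] += 1

    indices, offset = [], 0
    for length, count in zip(shot_lengths, counts):
        # Uniformly spaced frame indices within this shot.
        for k in range(count):
            indices.append(offset + (k * length) // max(count, 1))
        offset += length
    return indices
```

For example, a video with three shots of 100, 50, and 50 frames subsampled to 8 frames would contribute 4, 2, and 2 frames respectively, so longer shots are represented more heavily while every shot still gets coverage.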
KSP Suggested Keywords
Deep neural network (DNN), Empirical evaluation, Frame selection, Multi-shot, Video classification, Water spray, Mean average precision, New method, Physical effects, Video frames