ETRI Knowledge Sharing Platform


Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System
Cited 23 times in Scopus; downloaded 38 times
Authors
Jinwoo Kim, Jae Hong Ryu, Tae Man Han
Issue Date
2015-08
Citation
ETRI Journal, v.37, no.4, pp.793-803
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
Language
English
Type
Journal Article
DOI
https://dx.doi.org/10.4218/etrij.15.0114.0076
Abstract
We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.
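The abstract describes an HMI UI framework driven by four directions and one selection motion, fed by several modalities (speech, a manipulating device, hand gestures). One way to picture this is a single unified command set that each modality's events are translated into. The sketch below is a minimal illustration of that idea only; the modality vocabularies (`SPEECH_COMMANDS`, `GESTURE_COMMANDS`) and the `interpret` helper are hypothetical and not taken from the paper.

```python
from enum import Enum, auto
from typing import Optional

class Command(Enum):
    """Unified command set: four directions plus one selection motion."""
    UP = auto()
    DOWN = auto()
    LEFT = auto()
    RIGHT = auto()
    SELECT = auto()

# Hypothetical per-modality vocabularies, each mapped onto the shared command set.
SPEECH_COMMANDS = {
    "up": Command.UP, "down": Command.DOWN,
    "left": Command.LEFT, "right": Command.RIGHT,
    "ok": Command.SELECT,
}
GESTURE_COMMANDS = {
    "swipe_up": Command.UP, "swipe_down": Command.DOWN,
    "swipe_left": Command.LEFT, "swipe_right": Command.RIGHT,
    "grab": Command.SELECT,
}

def interpret(modality: str, event: str) -> Optional[Command]:
    """Translate a raw modality event into a unified UI command.

    Returns None for unrecognized modalities or events, so unknown
    input is simply ignored rather than distracting the driver.
    """
    table = {"speech": SPEECH_COMMANDS, "gesture": GESTURE_COMMANDS}.get(modality, {})
    return table.get(event)
```

Because every interface funnels into the same five commands, the UI framework itself stays modality-agnostic: a menu only needs to handle five inputs regardless of whether they came from speech recognition, a physical control, or a gesture.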
KSP Keywords
interface-based, multimodal interface, hand gesture recognition, in-vehicle infotainment, simple method, speech recognition