ETRI Knowledge Sharing Platform


Detailed Information

Journal Article
Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System
Cited 17 times in Scopus. Downloaded 23 times.
Authors
김진우, 류재홍, 한태만
Publication Date
August 2015
Source
ETRI Journal, v.37 no.4, pp.793-803
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
DOI
https://dx.doi.org/10.4218/etrij.15.0114.0076
Project
14MC1300, Development of Decision/Control Technology for an ICT-Based Vehicle/Driver Cooperative Autonomous Driving System (Co-Pilot), 한우용
Abstract
We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.
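The abstract describes an HMI UI framework driven by a small command vocabulary: four directions plus one selection motion, which every modality (speech, hand gesture, manipulating device) maps onto. A minimal sketch of that idea is shown below; the command names and mapping tables are illustrative assumptions, not the paper's actual API.

```python
from enum import Enum, auto

# The five-command vocabulary from the abstract: four directions + select.
class Command(Enum):
    UP = auto()
    DOWN = auto()
    LEFT = auto()
    RIGHT = auto()
    SELECT = auto()

# Hypothetical per-modality tables: each recognizer normalizes its raw
# output into the same five commands, so the UI framework handles only
# one small vocabulary regardless of input modality.
SPEECH_MAP = {"up": Command.UP, "down": Command.DOWN,
              "left": Command.LEFT, "right": Command.RIGHT,
              "ok": Command.SELECT}

GESTURE_MAP = {"swipe_up": Command.UP, "swipe_down": Command.DOWN,
               "swipe_left": Command.LEFT, "swipe_right": Command.RIGHT,
               "grab": Command.SELECT}

def normalize(modality: str, raw_input: str):
    """Translate a recognized phrase or gesture into a UI command,
    or None if the input is not part of the command vocabulary."""
    table = {"speech": SPEECH_MAP, "gesture": GESTURE_MAP}.get(modality, {})
    return table.get(raw_input)

print(normalize("speech", "ok"))         # Command.SELECT
print(normalize("gesture", "swipe_up"))  # Command.UP
```

Funneling all modalities into one tiny command set is what lets the same menu structure be driven by speech, gesture, or a physical controller interchangeably, which is the multitasking benefit the abstract's experiments evaluate.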
Keywords
GENIVI, Gesture recognition, HMI, Human-machine interface, In-vehicle infotainment, IVI, MMI, Multimodal interface, Multitasking, Speech recognition, UI, User experience, User interface, UX
KSP Suggested Keywords
Human-machine interaction(HMI), Interface-based, Machine Interface, Multimodal interface, User interface, hand Gesture recognition, in-vehicle infotainment, simple method, speech recognition, user experience