ETRI Knowledge Sharing Platform

Details

Journal Article | A Study on Drivers' Recognition of and Behavioral Responses to Control Authority Transition in Level 3 Automated Vehicles
Cited in Scopus: - | Downloads: 19
Authors
김현숙, 김우진, 김정숙, 이승준, 권오천, 윤대섭
Publication Date
December 2020
Source
Journal of the Ergonomics Society of Korea, v.39 no.6, pp.585-596
ISSN
1229-1684
Publisher
The Ergonomics Society of Korea
DOI
https://dx.doi.org/10.5143/JESK.2020.39.6.585
Research Project
20IR1400, In-depth Research on Human Factors in Automated Vehicles (SAE Level 2, 3), 윤대섭
Abstract
Objective: The purpose of this research is to conduct an experiment on the human factors of control authority transition and to analyze take-over performance using a vehicle simulator, in order to support the driver's take-over mechanism in Level 3 automated vehicles.

Background: How the transition of control to manual driving is announced, the driver's NDRTs (Non-Driving Related Tasks), and the driver's age and driving experience can all affect the quality and timing of manual driving re-engagement. Therefore, research on human factors for control authority transition in Level 3 automated vehicles is needed.

Method: We conducted experiments to identify how visual and cognitive workloads, pre-cues for attention shifts, driving situation information, the modality types that deliver TOR (Take-Over Request) information, driving readiness, and the driver's secondary task types affect control authority transition.

Results: We found that when a pre-cue or driving situation information is provided before the TOR, the driver's take-over performance improves. When the pre-cue is provided auditorily (4.25s), the time to recognize the TOR is significantly faster than when it is provided visually (6.25s), and providing driving situation awareness information (3.19s) is faster than not providing it (3.96s). We also found that take-over performance improves when haptic interaction is added to deliver TOR information: adding a haptic modality (3.75s) to an auditory TOR notification yielded a much faster TOR recognition time than adding a visual modality (4.57s). In addition, the greater the cognitive, visual, auditory, and hand-based physical demands of the secondary task (NDRT), the greater the workload felt by the driver, so the TOR takes longer to recognize and control authority transition performance decreases. TOR recognition time was faster when looking ahead or around (2.74s) than when drinking (3.12s) or texting (3.23s).

Conclusion: Level 3 automated vehicles must manage the driver's readiness to drive at all times so that the driver can regain control from the ADS and re-engage in driving. To this end, it is necessary to continuously develop driver monitoring systems and related technologies that measure the driver's gaze, hand movement, in-vehicle conversation, and seating information in real time.

Application: The results of this research can be used to develop guidelines and commercialization policies that can be referenced and applied by Level 3 automated vehicle companies and by organizations involved in automated driving.
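The conclusion's call for real-time driver monitoring can be illustrated with a minimal sketch. The Python snippet below is not from the paper: the sensor fields, the readiness rule, and the modality-selection logic are all hypothetical assumptions, used only to show how monitored signals (gaze, hands, conversation, seating) might feed the choice of TOR modalities, echoing the reported advantage of combining auditory and haptic cues.

```python
# Illustrative sketch only, not the authors' system. Field names, the
# escalation rule, and the modality choice are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class DriverState:
    eyes_on_road: bool      # hypothetical gaze-tracker output
    hands_on_wheel: bool    # hypothetical steering-wheel touch sensing
    talking: bool           # hypothetical in-cabin audio analysis
    seated_upright: bool    # hypothetical seat-posture sensing


def tor_modalities(state):
    """Choose TOR (Take-Over Request) modalities for a Level 3 hand-over.

    Echoes the reported finding that an auditory TOR plus a haptic cue was
    recognized faster (3.75s) than an auditory TOR plus a visual cue (4.57s),
    so a low-readiness driver gets the auditory + haptic combination here.
    """
    modalities = ["auditory"]  # baseline notification
    low_readiness = (not state.eyes_on_road or not state.hands_on_wheel
                     or state.talking or not state.seated_upright)
    if low_readiness:
        modalities.append("haptic")  # escalate when readiness is low
    return modalities


if __name__ == "__main__":
    texting_driver = DriverState(eyes_on_road=False, hands_on_wheel=False,
                                 talking=False, seated_upright=True)
    print(tor_modalities(texting_driver))  # -> ['auditory', 'haptic']
```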
KSP Suggested Keywords
Automated driving, Automated vehicles, Driver Monitoring System, Driving experience, Haptic Interaction, Human Factors, In-vehicle, Level 3, Long Time, Non-driving related tasks(NDRT), Physical demands