ETRI Knowledge Sharing Platform
Conference Paper: Investigation of Visual Self-Representation for a Walking-in-Place Navigation System in Virtual Reality
Cited 7 times in Scopus; downloaded 9 times
박찬호, 장경호
Conference on Virtual Reality and 3D User Interfaces (VR) 2019, pp. 1114-1115
19CS1200, Development of AR Platform Technology for User-Participatory Cultural Space Content, 정성욱
Walking-in-place (WIP) is a technique for navigation in virtual reality (VR) that can be configured in a limited space with a simple algorithm. Although WIP systems provide a sense of movement, delivering an immersive VR experience requires presenting sensory information as similar as possible to walking in the real world. There have been many studies on WIP technology, but the visual self-representation of WIP in the virtual environment (VE) has rarely been examined. In this paper, we describe our investigation of virtual self-representation for application to a WIP navigation system using an HMD and a full-body motion capture system. Our system moves the viewpoint in the pelvis direction, computed from the inertial sensor data, and a virtual body linked to the user's movement is seen from the first-person perspective (1PP) in two ways: (i) full body, and (ii) full body with natural walking. In (ii), when a step is detected, the motion of the lower part of the avatar is manipulated as if the user were performing real walking. We discuss the possibility of visual self-representation for the WIP system.
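The locomotion logic outlined in the abstract (a step detected from inertial sensor data triggers viewpoint displacement in the pelvis facing direction) could be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the acceleration threshold, step length, and all function names are hypothetical.

```python
import math

def detect_step(vertical_accel, threshold=1.5):
    """Hypothetical step detector: flag a step when vertical
    acceleration (gravity removed, in m/s^2) exceeds a threshold."""
    return vertical_accel > threshold

def wip_displacement(pelvis_yaw_rad, step_detected, step_length=0.7):
    """When a step fires, move the viewpoint by one step length in the
    pelvis yaw direction on the ground plane; otherwise stay in place.
    Returns a (dx, dy) displacement in meters."""
    if not step_detected:
        return (0.0, 0.0)
    return (step_length * math.cos(pelvis_yaw_rad),
            step_length * math.sin(pelvis_yaw_rad))
```

In a running system, `detect_step` would be fed each frame's filtered accelerometer sample and the resulting displacement applied to the camera rig, while the avatar's lower body plays a walking animation (variant (ii) in the paper).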
KSP Suggested Keywords
First-person, Full Body, Immersive VR, Inertial sensors, Limited space, Motion capture system, Natural walking, Real-walking, Real-world, Self-representation, Simple algorithm