ETRI Knowledge Sharing Platform

Investigation of Visual Self-Representation for a Walking-in-Place Navigation System in Virtual Reality
Cited 7 times in Scopus
Authors
Chanho Park, Kyungho Jang
Issue Date
2019-03
Citation
Conference on Virtual Reality and 3D User Interfaces (VR) 2019, pp.1114-1115
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/VR.2019.8798345
Abstract
Walking-in-place (WIP) is a technique for navigation in virtual reality (VR) that can be set up in a limited space with a simple algorithm. Although WIP systems provide a sense of movement, delivering an immersive VR experience requires providing information as similar as possible to walking in the real world. There have been many studies on WIP technology, but little work has addressed the visual self-representation of WIP in the virtual environment (VE). In this paper, we describe our investigation of virtual self-representation for a WIP navigation system using an HMD and a full-body motion capture system. Our system moves the user in the direction of the pelvis, computed from inertial sensor data, and a virtual body linked to the user's movement is seen from the first-person perspective (1PP) in two ways: (i) full body, and (ii) full body with natural walking. In (ii), when a step is detected, the motion of the lower part of the avatar is manipulated as if the user were performing real walking. We discuss the possibility of visual self-representation for the WIP system.
KSP Keywords
First-person, Full Body, Immersive VR, Inertial sensors, Limited space, Motion capture system, Natural walking, Real-walking, Real-world, Self-representation, Simple algorithm
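
The abstract describes a locomotion scheme in which steps are detected from inertial sensor data and the viewpoint is advanced along the pelvis heading. The Python sketch below is only a minimal illustration of that general idea under assumed details: the ImuSample and WipLocomotion names, the acceleration threshold, and the fixed step length are illustrative assumptions, not the authors' implementation.

import math
from dataclasses import dataclass


@dataclass
class ImuSample:
    """One inertial sample: vertical foot acceleration and pelvis heading (radians)."""
    foot_accel_z: float   # vertical acceleration of a foot-mounted sensor (m/s^2)
    pelvis_yaw: float     # heading of the pelvis sensor (radians)


class WipLocomotion:
    """Walking-in-place locomotion: detect steps, advance along the pelvis heading."""

    def __init__(self, accel_threshold=2.0, step_length=0.6):
        self.accel_threshold = accel_threshold  # assumed step-detection threshold (m/s^2)
        self.step_length = step_length          # assumed virtual displacement per step (m)
        self._above = False                     # simple hysteresis state
        self.position = [0.0, 0.0]              # virtual x, y position (m)

    def update(self, sample: ImuSample) -> bool:
        """Process one sample; returns True when a new step is detected."""
        step = False
        if not self._above and sample.foot_accel_z > self.accel_threshold:
            self._above = True
            step = True
            # Move the viewpoint in the pelvis-facing direction.
            self.position[0] += self.step_length * math.cos(sample.pelvis_yaw)
            self.position[1] += self.step_length * math.sin(sample.pelvis_yaw)
        elif self._above and sample.foot_accel_z < self.accel_threshold * 0.5:
            self._above = False  # reset once the acceleration falls back down
        return step

In a complete system of the kind the abstract describes, a detected step would also trigger the manipulated lower-body walking motion of the avatar in condition (ii), while the full-body motion capture data drives the rest of the first-person virtual body.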