An Evaluation Model for the Design of Virtual Reality Systems
Published in Tugrul Daim, Marina Dabić, Yu-Shan Su, The Routledge Companion to Technology Management, 2023
In the “Meant to be Seen” 3D Forum, Oculus founder Palmer Freeman Luckey displayed a new HMD. At the time, it was a relatively capable HMD that was also affordable for gamers. The Oculus Rift is a head-mounted VR device, financed and developed by Oculus VR in 2012. The company was acquired by Facebook in March 2014, and the product was launched on March 28, 2016. The Oculus Rift was the first consumer-oriented head-mounted VR device on the market and the “first professional VR headgear for PC”. In addition to gaming entertainment, the Oculus Rift supports media, social networking, and industrial applications (Wikipedia, 2021b).
Pre-Occupancy evaluation of buildings in VR: development of the prototype and user studies
Published in Architectural Science Review, 2022
Anastasia Globa, Rui Wang, Olubukola Tokede, Chin Koi Khoo
Virtual models were built with Rhino 3D and Grasshopper3D, then exported into the game engine Unity3D. User interfaces (UI), rendering effects, spatial time-specific soundscapes, and experiment procedures were integrated into Unity3D, and the results were output to fully immersive VR devices. Two types of fully immersive VR headsets were adopted in this study: the Oculus Rift headset connected to VR-capable computers, and the standalone portable Oculus Quest headset. Both headsets were equipped with two motion controllers of the same configuration and provided first-person-perspective, state-of-the-art fully immersive interactive VR experiences. The graphic performance of the Oculus Rift was higher than that of the Oculus Quest, while the Quest enabled freer movement because it was not tethered. By using two different types of VR devices, we were able to leverage all the VR equipment available in our lab and complete the experiment sessions within the workshop that architecture professionals would attend.
How do atria affect navigation in multi-level museum environments?
Published in Architectural Science Review, 2021
Athina Lazaridou, Sophia Psarra
A critical issue related to navigation is how people scan the environment in order to make route choices (e.g. Haq and Zimring 2003). The relationship between head movement, the direction of gaze, and the decisions people make when moving is of central importance in studying spatial navigation. Specifically, head movement in VR was found to determine participants’ gaze direction (Christou et al. 2016). The field of view in that VR system was 110°, the same as in the Oculus Rift. Results showed that head movements were crucial, because users had to maintain awareness of their position along the path while evaluating their position in relation to the upcoming junctions (Christou et al. 2016). A study by Bowman et al. (2004) indicated that a large field of view (110°) in VR reduced the amount of head movement, allowing users to understand spatial relationships more easily. Similarly, Barton, Valtchanov, and Ellard (2014) showed that the frequency of visitors’ head turns is higher during a navigation task with a limited angle of vision. It thus seems that studying head movement in VR can yield valuable insights into exploration. The present study focuses on head movement in VR, aiming to understand how the volumetric treatment of buildings, which is often purposely designed to aid navigation, impacts exploration.
Towards the development of an intuitive teleoperation system for human support robot using a VR device
Published in Advanced Robotics, 2020
Jun Nakanishi, Shunki Itadera, Tadayoshi Aoyama, Yasuhisa Hasegawa
Figure 1 shows the configuration of the teleoperation system developed in this paper. As a user interface, we use the Oculus Rift/Touch (Facebook Technologies, formerly Oculus VR), a commercially available VR headset (Oculus Rift) and handheld motion controller (Oculus Touch). Figure 2 presents an overview of the system, the motion mapping between the user and the robot, and the visual and haptic feedback provided to the user during teleoperation. Based on the measured position and orientation of these devices, we control the corresponding DoFs of the robot so that the movement of the robot is synchronized with that of the user. We provide the user with a stereoscopic image of the robot’s surroundings projected on the head-mounted display (HMD) of the Oculus Rift headset, and haptic feedback through vibration in the Oculus Touch controller according to the magnitude of the contact forces. The horizontal head position (2 DoFs) and the hand position/orientation (6 DoFs) of the user are mapped to the base and arm DoFs of the robot (8 DoFs in total). The head pan/tilt motion of the robot is controlled to track the corresponding head orientation (2 DoFs) of the user. The open/close command for the gripper and the start/stop command for the suction pad are given by a trigger and a button on the Oculus Touch device, respectively. In addition, the horizontal movement of the mobile base is commanded by the thumbstick, and the direction of base rotation is commanded by buttons, both via velocity control. In this way, it is possible to perform coordinated whole-body control of the HSR via the user’s hand and head movements while simultaneously navigating the robot in the environment via joystick control.
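The mapping described above can be sketched in code. This is a minimal illustration, not the paper's actual controller: the `UserInput` and command fields, the proportional gains, and the simple pass-through mapping are all assumptions made for clarity.

```python
from dataclasses import dataclass

@dataclass
class UserInput:
    """Hypothetical container for the measured Rift/Touch state."""
    head_xy: tuple        # horizontal head position (2 DoFs)
    head_pan_tilt: tuple  # head orientation (2 DoFs)
    hand_pose: tuple      # hand position/orientation (6 DoFs)
    trigger: bool         # Touch trigger: gripper open/close
    suction_button: bool  # Touch button: suction pad start/stop
    thumbstick: tuple     # (x, y) horizontal base velocity command
    rotate_dir: int       # -1, 0, +1 base rotation direction (buttons)

def map_to_robot(u: UserInput, lin_gain: float = 0.5, rot_gain: float = 0.8) -> dict:
    """Map user input to robot commands: 2 base + 6 arm DoFs track the
    user's head position and hand pose; discrete and velocity commands
    come from the Touch controls. Gains are illustrative values."""
    return {
        # base (2 DoFs) follows horizontal head position; arm (6 DoFs) follows hand
        "base_xy": u.head_xy,
        "arm_pose": u.hand_pose,
        # robot head pan/tilt tracks the user's head orientation
        "head_pan_tilt": u.head_pan_tilt,
        # discrete commands from Touch trigger/button
        "gripper_close": u.trigger,
        "suction_on": u.suction_button,
        # velocity commands for navigating the mobile base
        "base_vel": (lin_gain * u.thumbstick[0], lin_gain * u.thumbstick[1]),
        "base_yaw_rate": rot_gain * u.rotate_dir,
    }

cmd = map_to_robot(UserInput(
    head_xy=(0.1, 0.0), head_pan_tilt=(0.2, -0.1),
    hand_pose=(0.3, 0.0, 0.9, 0.0, 0.0, 0.0),
    trigger=True, suction_button=False,
    thumbstick=(0.0, 1.0), rotate_dir=0))
```

In a real system these commands would be published at the control loop rate, with the 8 tracked DoFs (base + arm) driven by inverse kinematics rather than a direct pass-through.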