Information and communication technology/Building Information Modeling (BIM)
Published in Lean Project Delivery and Integrated Practices in Modern Construction, 2020
Lincoln H. Forbes, Syed M. Ahmed
Similar to basic computing in the 1960s, MR has traditionally been reserved for entities with both specialized equipment and personnel, such as the oil and gas industry or medicine. However, recent advancements in this field, such as the Microsoft HoloLens®, have simplified MR hardware to the point where almost anyone can take advantage of it; no specialized knowledge is necessary to perform most tasks (DeValle and Azhar 2018). The HoloLens® is a self-contained head-mounted display with specialized components, such as a Holographic Processing Unit (HPU) and advanced sensors, that enable holographic computing. These components allow HoloLens users to engage with digital content and interact with holograms in the world around them. The first-generation HoloLens® was released by Microsoft in 2016, while the second-generation HoloLens® 2 was scheduled for release in July 2019. More information about the HoloLens® can be found at: www.microsoft.com/en-us/hololens.
Effects of Optical See-Through Head-Mounted Display Use for Simulated Laparoscopic Surgery
Published in International Journal of Human–Computer Interaction, 2023
Yaoyu Fu, Steven D. Schwaitzberg, Lora Cavuoto
The Microsoft HoloLens 1 (Microsoft, Redmond, WA, USA) was used as the OST-HMD in this study. The HoloLens 1 has a battery life of 2–3 h (Gsaxner et al., 2023), allowing untethered use for the whole session. The HoloLens scenario was developed with the Mixed Reality Toolkit (MRTK) v2.0.0 and Unity3D (2018.4.6f1). The 2D camera feed of the FLS trainer box (the laparoscopic view) was transferred to the computer by an RCA cable and an RCA-to-USB adapter. The primary screen in the HL + HL and HL + M settings was designed to be similar in size and distance to the monitor in the M + M setting. The vital signs monitor was developed with the Pulse Physiology Engine Unity asset by Kitware (Kitware, Inc., Carrboro, NC, USA). The automatic start/stop of the adverse events and the automatic pauses were implemented in C# scripts. The interactive buttons were developed with the button prefabs in MRTK. The click action was connected to the Pulse Physiology Engine, which allowed the vital signs to change according to the participants’ actions. For settings that used the HoloLens, the final scenario was streamed over Wi-Fi through Holographic Remoting (Microsoft, 2022). The video codec was H264 (Microsoft, 2023) and the video was collected at 1920 × 1080 resolution.
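The study's C# scripts are not published, but the automatic start/stop behavior described above can be sketched in a few lines. The following is an illustrative Python sketch under assumed names and timings (the `AdverseEvent` class, the event names, and the timeline are all hypothetical, not taken from the paper): each scripted adverse event carries its own start and stop times, so the scenario activates and ends it from the clock alone, with no experimenter input.

```python
from dataclasses import dataclass

@dataclass
class AdverseEvent:
    """A scripted adverse event with automatic start/stop times (seconds).

    Hypothetical stand-in for the study's C# logic, for illustration only.
    """
    name: str
    start: float
    stop: float

    def active(self, t: float) -> bool:
        # The event runs once the scenario clock passes `start` and ends
        # itself at `stop`; nothing is triggered manually.
        return self.start <= t < self.stop

# Hypothetical timeline; the published scenario's exact events and timings
# are not given in the excerpt.
events = [AdverseEvent("tachycardia", 60.0, 120.0),
          AdverseEvent("desaturation", 150.0, 210.0)]

def active_events(t: float) -> list[str]:
    """Names of the events the scenario should be displaying at time t."""
    return [e.name for e in events if e.active(t)]
```

Queried at 90 s into the scenario, `active_events(90.0)` returns `["tachycardia"]`; in the study, the equivalent check would drive the Pulse-based vital signs rather than return names.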
Effect of Transparency Levels and Real-World Backgrounds on the User Interface in Augmented Reality Environments
Published in International Journal of Human–Computer Interaction, 2023
Muhammad Hussain, Jaehyun Park
The AR device used in this study was the Microsoft HoloLens 2 (Microsoft 2022). It includes optical see-through holographic lenses with a 2k, 3:2 light-engine resolution and a holographic density greater than 2.5k radiants (light points per radian). Hand tracking was used to interact with the HoloLens, and the prototype was created in Unity 3D using C# scripting. A seminar room was used as the experimental room for this study, with a controlled lighting condition of 790–800 lux during daylight hours (Figure 1). The lighting in the experimental room was measured with a lux meter and kept the same for all experimental conditions. The brightness levels were monitored during the experiment, and the HoloLens 2 Level 5 brightness setting was applied. A black screen was used to control the background color effect during the experiment for uniform conditions. The prototype consists of nine square buttons (3 × 3 array). The distance between the virtual object and the eye was 120 cm, and the field of view of the buttons was 3°50″ (Figure 2). The color model considered for the prototype was RGBA with R: 28, G: 207, and B: 30, and five conditions according to the value of the alpha channel (0, 0.25, 0.5, 0.75, and 1) were considered to analyze the color transparency. The color code in terms of HSV space was H: 121°, S: 86.5%, and V: 81.2%. A color with an alpha value of 0 is completely transparent, whereas a value of 1 is solid (opaque) (Kia et al., 2021).
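The RGB-to-HSV correspondence and the five alpha conditions above can be checked with a short script. This is an illustrative sketch using Python's standard `colorsys` module, not part of the study's Unity prototype; the compositing-over-black step assumes straight (non-premultiplied) alpha, which matches the study's use of a black screen as the uniform background.

```python
import colorsys

# Prototype button color from the study: R=28, G=207, B=30 (0-255 scale).
r, g, b = 28, 207, 30

# colorsys works on 0-1 floats; convert back to degrees and percentages.
h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
print(round(h * 360), round(s * 100, 1), round(v * 100, 1))
# -> 121 86.5 81.2, matching the reported H: 121°, S: 86.5%, V: 81.2%

# Straight alpha compositing over a black background: each displayed
# channel scales linearly with alpha, so alpha=0 shows nothing (fully
# transparent) and alpha=1 shows the full color (opaque).
for alpha in (0, 0.25, 0.5, 0.75, 1):
    shown = tuple(round(c * alpha) for c in (r, g, b))
    print(alpha, shown)
```

Running the loop confirms the two endpoints cited from Kia et al. (2021): alpha 0 composites to (0, 0, 0), i.e. the color disappears into the background, and alpha 1 reproduces (28, 207, 30) unchanged.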
Interaction Strategies for Effective Augmented Reality Geo-Visualization: Insights from Spatial Cognition
Published in Human–Computer Interaction, 2021
Aaron L. Gardony, Shaina B. Martis, Holly A. Taylor, Tad T. Brunyé
AR refers to the addition of virtual objects into the real world that appear to coexist in the same space as the user (Azuma, 1997). This contrasts with VR, where virtual objects are displayed within a virtual (rather than real) environment (Milgram & Kishino, 1994). Both technologies allow interactive visualization of 3D content, with the former more recently applied in an untethered, head-worn format. So-called standalone head-worn AR systems neither connect to nor depend on an external computer for processing, permitting their use in a variety of mobile contexts. Of the head-worn consumer-grade AR technology available today, the Microsoft HoloLens stands out as the most mature, providing rich 3D graphical content, markerless inside-out tracking, wireless network connectivity enabling collaborative AR experiences, and support for gestural and voice-command interaction with digital content in a self-contained headset. These features have made the HoloLens an attractive tool for researchers investigating the utility, usability, and cognitive impact of standalone AR systems. Indeed, recent research has used the HoloLens to investigate a variety of use cases, including driving (Kun, van der Meulen, & Janssen, 2017), manual assembly (Blattgerste, Strenge, Renner, Pfeiffer, & Essig, 2017), network operations monitoring (Beitzel et al., 2016; Beitzel, Dykstra, Toliver, & Youzwak, 2018), data visualization (Hockett & Ingleby, 2016; Saenz, Baigelenov, Hung, & Parsons, 2017), and medical imaging (Cui, Kharel, & Gruev, 2017; Hackett & Proctor, 2016; Karmonik, Boone, & Khavari, 2017).