A framework for introducing emerging technologies in design studio classes
Published in Research & Education in Design: People & Processes & Products & Philosophy (Rita Almendra and João Ferreira, eds.), 2020
Most graduate students apply the technology they select to their previously identified research topics. Undergraduates apply the technology they choose to the needs of the research institute facility. In a hands-on workshop, the undergraduate students are introduced to several technologies unfamiliar to them that can optionally also be used for prototyping. WebXR (“WebXR”, n.d.) enables browser-based distribution of VR/AR experiences. Students are specifically introduced to Sketchfab (“Sketchfab”, n.d.), A-Frame (“A-Frame”, n.d.), and Glitch (“Glitch”, n.d.), as well as 360-degree spherical photography. A few additional constraints are given to the undergraduates to provide further guidance. Their prototyping must include a video showing classmates interacting with their simulated systems. They also need to design both a physical component and an online accessible component. It is emphasized that it is completely acceptable to fail to get a technology working, as long as their investigations and learning are thoroughly documented.
Interactive, in-browser cinematic volume rendering of medical images
Published in Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, 2023
Jiayi Xu, Gaspard Thevenon, Timothee Chabat, Matthew McCormick, Forrest Li, Tom Birdsong, Ken Martin, Yueh Lee, Stephen Aylward
Just as consumer (gaming) adoption of GPUs in the 1990s was the catalyst for 3D medical volume rendering becoming clinically routine, we anticipate that the consumer adoption of XR systems will spur their clinical use over the next few years. Therefore, we have been advancing the visualisation algorithms necessary for medical XR applications in the open-source visualisation libraries VTK (the visualisation toolkit, www.vtk.org) and vtk.js (VTK rewritten in JavaScript for in-browser applications, https://github.com/kitware/vtk-js). We anticipate that XR systems will increasingly build upon web technologies, such as vtk.js, as exhibited by the consortium of businesses rallying behind the WebXR standard (Kokelj et al. 2018). We envision a future with WebXR dominating the medical XR domain, including, for example, digital twin and surgical simulation applications (Noguera and Jiménez 2016).
Virtual Reality System for Monte Carlo Transport Simulation Codes Using Web Technology and Commodity Devices
Published in Nuclear Science and Engineering, 2023
WebXR is a standard for realizing VR and augmented reality (AR) in a web browser; consequently, VR and AR applications can be created without depending on the operating system (OS) by using any browser that supports the standard. In particular, VR web applications can obtain gyro-sensor and position-tracking information from devices through the WebXR API.

In a usual 3D viewer application, some 3D objects and a camera are set up, and the image seen from the camera is rendered and sent to a display. VR applications differ in that they send a different viewpoint image to each of two displays (left eye and right eye), and the viewpoints move synchronously with the head-mounted display (HMD). The @react-three/xr library associates WebXR’s XRViewport with the Three.js canvas, so HMD motion is automatically reflected in the application’s camera position.

For controllers, the VR-specific “select” and “grab” events are managed by the @react-three/xr component, while other actions, such as controller buttons and sticks, must be processed by calling the WebXR Gamepad API directly. Specifically, the joystick and button states of the controller are read at each frame (every canvas refresh), and the camera position and object status are changed when the joystick is deflected or a button is pushed.

Figure 7 shows the name of each part of the controller, and Table I lists the operations supported by Gxsview-web. Gxsview-web supports six-degrees-of-freedom (6DoF) devices, i.e., an HMD and handheld controllers with position tracking, such as the Oculus Quest 2 (Ref. 26). Support for three-degrees-of-freedom (3DoF) devices is incomplete: it should be noted that it is not possible to move away from the origin when the application is accessed with a 3DoF device that lacks a handheld controller, such as Google Cardboard (Ref. 27). The difference between 6DoF and 3DoF devices is shown in Figs. 8a and 8b.
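The per-frame joystick handling described above can be sketched in plain JavaScript. This is a minimal illustration, not Gxsview-web's actual code: the helper names (`thumbstickDelta`, `pollControllers`), the camera object, the speed constant, and the sign conventions are assumptions for this sketch, while `inputSources`, `gamepad`, and the thumbstick living at `axes[2]`/`axes[3]` follow the WebXR Gamepad Module's xr-standard mapping.

```javascript
const SPEED = 0.05; // metres per frame; an assumed constant

// Pure helper: turn thumbstick axes plus the HMD's yaw into a camera
// translation for this frame. Pushing the stick forward gives y < 0;
// "forward" is taken as -z, the usual WebGL/Three.js convention.
function thumbstickDelta(axes, yawRadians, speed = SPEED) {
  const [x, y] = axes;
  const forward = -y; // forward amount (positive when the stick is pushed up)
  // Rotate the stick vector by the HMD yaw so "forward" follows the gaze.
  const dx = (x * Math.cos(yawRadians) + forward * Math.sin(yawRadians)) * speed;
  const dz = (x * Math.sin(yawRadians) - forward * Math.cos(yawRadians)) * speed;
  return { dx, dz };
}

// Called once per rendered frame: poll each input source's gamepad state
// and move the (hypothetical) camera object accordingly.
function pollControllers(xrSession, camera) {
  for (const source of xrSession.inputSources) {
    const gp = source.gamepad;
    if (!gp || gp.axes.length < 4) continue;
    // xr-standard mapping: thumbstick is axes[2] (x) and axes[3] (y)
    const { dx, dz } = thumbstickDelta([gp.axes[2], gp.axes[3]], camera.yaw);
    camera.x += dx;
    camera.z += dz;
  }
}
```

Keeping the axis-to-translation arithmetic in a pure function like `thumbstickDelta` makes the movement logic testable outside the browser, since the WebXR session and gamepad objects only exist in an immersive session.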