Unmanned Aerial System (UAS) Applications in the Built Environment
Published in David R. Green, Billy J. Gregory, Alex R. Karachok, Unmanned Aerial Remote Sensing, 2020
Photos taken during our test drone flight were processed in Pix4D to generate a 3D point cloud from the 2D images. Depending on the number of photos taken, the software may recommend splitting the data set across multiple models. Based on empirical trials and observations, approximately 1,000–1,300 photos is a suitable number for a single processing run, a limit imposed by current software constraints that may be relaxed in the future. Multiple models can be merged upon completion, which allows the operator to eliminate photos that do not calibrate properly. The program matches pixels across the 2D images and locates each matched pixel by triangulation within a 3D point cloud model. During this process, images of inferior quality and images that did not capture the subject are rejected by the program. Upon generation of the model, we observed that the most detailed part of the model was consistently the southern façade of the inspected building. The Pix4D modeler runs a 15-point process that produces a report noting, among other variables, the efficiency of the photos used in generating the model, image overlap, and location. This output highlights weaknesses in the image-acquisition process and should inform future flights. The model can be processed further in other 3D modelling and CAD software such as Rhino3D; when exported as an .fbx file, the 3D model retains its render and texture information and can be used for rendering and 3D printing (Figure 14.7).
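The triangulation step that Pix4D performs internally can be illustrated with a minimal sketch. Assuming two calibrated cameras with known 3×4 projection matrices and a feature matched in both photos, the corresponding 3D point can be recovered by linear (DLT) triangulation; the camera parameters and pixel coordinates below are purely illustrative and are not taken from the chapter or from Pix4D's implementation.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices
    x1, x2 : (u, v) pixel coordinates of the same feature in each image
    """
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical setup: two cameras one metre apart, both looking down the z-axis.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D point into both views, then recover it again.
X_true = np.array([0.3, -0.2, 5.0, 1.0])
x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
print(triangulate_point(P1, P2, x1, x2))  # approximately [0.3, -0.2, 5.0]
```

Repeating this for every matched pixel across overlapping photos is what builds up the dense point cloud described above.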
Development of the bridge inspection experience system with MR head-mounted display
Published in Hiroshi Yokota, Dan M. Frangopol, Bridge Maintenance, Safety, Management, Life-Cycle Sustainability and Innovations, 2021
Y. Baba, H. Emoto, S. Tanikawa, H. Nakamura, K. Kawamura
After making the 3D-VR model, it must be imported into the Unity system in a way that keeps the relationship between the polygon data, which describes the shape of the object, and the bridge texture image data. To do so, the 3D-VR model created in Metasequoia is serialized to the FBX (.fbx) format (Autodesk Inc. 2019). This file format, originally developed for Kaydara's FilmBox and currently licensed by Autodesk, allows 3D data to be exchanged smoothly between applications. Because the FBX file preserves the relationship between the polygon data and the image data, the Unity system can import the 3D-VR model directly.
Use of gaming technology to bring bridge inspection to the office
Published in Structure and Infrastructure Engineering, 2019
Muhammad Omer, Lee Margetts, Mojgan Hadi Mosleh, Sam Hewitt, Muhammad Parwaiz
In order to view architectural models and point clouds in the VR headset, a number of steps are required. Architectural models can be created using AutoCAD 2017 and exported as a .fbx file, a proprietary format for 3D models. To import a point cloud, a script reads the point cloud (saved in .ply file format) and plots each point within Unity. Once the architectural model and/or point cloud is loaded into Unity's scene, the initial position of the camera is set manually. Extra coding is required to enable the user to navigate the scene using a button on the Bluetooth controller. Once scene creation is complete, specific settings are chosen in Unity so that the scene can be exported as an app for viewing on the VR headset. For example, Unity needs to know that it is building an app for the Android operating system so that it is compatible with the Samsung Galaxy S6 smartphone. Furthermore, additional plugins such as the Android SDK Manager and the Java Runtime need to be installed for Unity.
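The authors' point-cloud import script runs inside Unity, but the parsing step itself is straightforward. Below is a minimal Python sketch of reading an ASCII .ply file and collecting the vertex coordinates that would then be plotted in the scene; the file name and the assumption that x, y, z are the first vertex properties are illustrative, not taken from the paper.

```python
# Minimal sketch of the point-cloud reading step, assuming an ASCII .ply file
# whose vertex elements begin with x, y, z properties. The paper's script runs
# inside Unity; this version only illustrates the parsing logic.

def read_ply_points(path):
    """Return a list of (x, y, z) tuples from an ASCII .ply file."""
    with open(path) as f:
        # --- header ---
        assert f.readline().strip() == "ply"
        vertex_count = 0
        for line in f:
            line = line.strip()
            if line.startswith("element vertex"):
                vertex_count = int(line.split()[-1])
            elif line == "end_header":
                break
        # --- body: one vertex per line, x y z first ---
        points = []
        for _ in range(vertex_count):
            x, y, z = map(float, f.readline().split()[:3])
            points.append((x, y, z))
        return points

if __name__ == "__main__":
    pts = read_ply_points("bridge_scan.ply")  # hypothetical file name
    print(len(pts), "points, first:", pts[0])
```

In the workflow described in the paper, each recovered (x, y, z) triple is then plotted as a point in Unity's scene before the scene is packaged as an Android app.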