Computer-Based Presentation Systems
Published in Paul W. Ross, The Handbook of Software for Engineers and Scientists, 2018
The palette specifies the foreground and background colors in use at any point in time. The chosen colors are shown in the selected colors box. To change the foreground color, click on a color in the color palette with the left mouse button. To change the background color, click on a color in the palette with the right mouse button.
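As a minimal sketch of this interaction in Python/Tkinter (not the handbook's presentation package; the palette colors, widget names, and layout below are illustrative), left-clicking a swatch sets the foreground color, right-clicking sets the background color, and both are echoed in a selected-colors box:

```python
import tkinter as tk

# Illustrative palette; the actual package's palette is application-specific.
PALETTE = ["black", "white", "red", "green", "blue", "yellow", "cyan", "magenta"]

class PaletteDemo(tk.Frame):
    def __init__(self, master):
        super().__init__(master)
        self.foreground = "black"
        self.background = "white"

        # One swatch per color; left-click sets foreground, right-click background.
        row = tk.Frame(self)
        row.pack(side=tk.TOP, padx=4, pady=4)
        for color in PALETTE:
            swatch = tk.Label(row, bg=color, width=3, relief=tk.RAISED)
            swatch.pack(side=tk.LEFT, padx=1)
            swatch.bind("<Button-1>", lambda e, c=color: self.set_foreground(c))
            swatch.bind("<Button-3>", lambda e, c=color: self.set_background(c))

        # "Selected colors" box: background swatch with the foreground swatch on top.
        self.bg_box = tk.Label(self, bg=self.background, width=8, height=3)
        self.bg_box.pack(side=tk.TOP, pady=4)
        self.fg_box = tk.Label(self.bg_box, bg=self.foreground, width=4, height=1)
        self.fg_box.place(relx=0.5, rely=0.5, anchor=tk.CENTER)

    def set_foreground(self, color):
        self.foreground = color
        self.fg_box.configure(bg=color)

    def set_background(self, color):
        self.background = color
        self.bg_box.configure(bg=color)

if __name__ == "__main__":
    root = tk.Tk()
    root.title("Palette demo")
    PaletteDemo(root).pack()
    root.mainloop()
```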
Phased array ultrasonic imaging and characterization of adhesive bonding between thermoplastic composites aided by machine learning
Published in Nondestructive Testing and Evaluation, 2023
Guanyu Piao, Jorge Mateus, Jiaoyang Li, Ranjit Pachha, Parvinder Walia, Yiming Deng, Sunil Kishore Chakrapani
The PAUT system presented in Section 2 was used to collect the data from the fabricated samples with the different bonding conditions. Each sample was scanned twice using the PAUT transducer. Two examples of raw C-Scan images, from the control sample and from sample T2, are shown in Figures 3(a) and 3(b), respectively. The adhesion area, denoted by blue/white colour, can be easily identified from the C-Scan data with a high SNR. Based on the default colour palette of the acquisition software, the white colour indicates good adhesive bonding, i.e. low reflection, while the blue colour indicates mid-level adhesive bonding conditions and the red colour indicates disbonds or defects due to a high reflection coefficient.
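A rough illustration of rendering a reflection-coefficient map with a comparable palette is sketched below in Python (synthetic data; the colormap only approximates the white/blue/red convention described above, not the acquisition software's actual default palette):

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LinearSegmentedColormap

# Synthetic reflection-coefficient map standing in for a raw C-Scan;
# the real data come from the PAUT acquisition system.
rng = np.random.default_rng(0)
c_scan = np.clip(rng.normal(0.15, 0.05, (64, 64)), 0, 1)   # mostly well bonded
c_scan[20:30, 20:30] = 0.9                                   # simulated disbond
c_scan[40:55, 10:25] = 0.5                                   # mid-level bonding

# Approximation of the described palette: low reflection -> white (good bond),
# mid reflection -> blue, high reflection -> red (disbond).
palette = LinearSegmentedColormap.from_list("bond", ["white", "blue", "red"])

plt.imshow(c_scan, cmap=palette, vmin=0.0, vmax=1.0)
plt.colorbar(label="Reflection coefficient")
plt.title("Synthetic C-Scan with bond-quality palette")
plt.show()
```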
Designing of an inflammatory knee joint thermogram dataset for arthritis classification using deep convolution neural network
Published in Quantitative InfraRed Thermography Journal, 2022
Shawli Bardhan, Satyabrata Nath, Tathagata Debnath, Debotosh Bhattacharjee, Mrinal Kanti Bhowmik
The captured thermograms are in the Rainbow palette, where each pixel represents inflammation with Red (R), Green (G), and Blue (B) channel values, as shown in Figures 4 and 5(a). In the pre-processing step, the Rainbow palette is converted into a grey-palette format using the FLIR tool software. In the grey palette, the inflammation is still represented by three channels with equal intensity values, which gives the thermogram a visually grey appearance with a 24-bit intensity value. Therefore, any single channel with an 8-bit intensity value is identical to the 24-bit grey-palette thermogram. The advantage of using the grey palette is the proportionality of intensity to temperature: as temperature increases, intensity also increases, so the highest temperature is represented by the highest intensity value within the thermogram, and the reverse holds for the lowest temperature. In the knee thermogram dataset, the acquisition is focused on the knee region, but the thermogram also contains inflammation-related information from unwanted leg portions, as well as information such as the company logo, temperature scale, and tags. This information complicates thermogram processing and produces erroneous analysis results. Therefore, in preprocessing, the knee region is manually extracted from the whole thermogram through rectangular cropping, as shown in Figure 5. The manually cropped, grey-palette knee region is the final output of the preprocessing step and is used as the input to the subsequent steps of arthritis classification.
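The channel-selection and cropping steps can be sketched in Python roughly as follows (synthetic data and placeholder crop coordinates; the dataset's actual workflow uses the FLIR tool export and manual region selection):

```python
import numpy as np

# Hypothetical 24-bit grey-palette thermogram: all three channels carry the
# same 8-bit intensity, so any one channel suffices. A synthetic array stands
# in here for the image exported from the FLIR tool.
rng = np.random.default_rng(0)
intensity = rng.integers(0, 256, (480, 640), dtype=np.uint8)
thermogram = np.dstack([intensity, intensity, intensity])   # identical R, G, B

# Keep a single 8-bit channel; it is identical to the 24-bit grey-palette image.
grey_channel = thermogram[:, :, 0]

# Manual rectangular crop of the knee region; the coordinates are placeholders
# for the values chosen per thermogram. Cropping also discards the logo,
# temperature scale, and tags along the image borders.
y0, y1, x0, x1 = 120, 360, 80, 420
knee_roi = grey_channel[y0:y1, x0:x1]

# Higher intensity corresponds to higher temperature in the grey palette,
# so the hottest pixel in the cropped region is simply the maximum.
print("Hottest intensity in cropped knee region:", int(knee_roi.max()))
```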
Assessing tree crown fire damage integrating linear spectral mixture analysis and supervised machine learning on Sentinel-2 imagery
Published in International Journal of Digital Earth, 2023
Giandomenico De Luca, Giuseppe Modica, João M. N. Silva, Salvatore Praticò, José M.C. Pereira
The FIs generated by applying the FCLS spectral mixture model to the images acquired immediately after the fire (2018 for PT, 2021 for IT), representing the proportion of each of the four components (%ch, FI-1; %sc, FI-2; %gr, FI-3; %bs, FI-4), are shown in Figures 5 (LibrEnds_PT), 6 (ImgEnds_PT), 7 (LibrEnds_IT), and 8 (ImgEnds_IT). The figures illustrate a portion of the scene where all four components are clearly observable. The value of each pixel is directly associated with the proportion (abundance) of each of the four respective endmembers selected through the PPI index, displayed on a grayscale range normalized between 0 (black) and 1 (white) to improve visualization. This is more evident in the RGB false-color image (Figures 5–8, lower left corner), in which the three fire-related FIs were combined (Red = FI-2, %sc; Green = FI-4, %gr; Blue = FI-1, %ch). In PT (Figures 5 and 6), the prevalence of the %ch component in the ImgEnds FIs, compared to those retrieved from LibrEnds in terms of proportion and occupied surface, is perceptible from the grayscale palette, where the higher the proportion, the brighter the pixel color. Notably, in LibrEnds the %ch component is totally excluded from the surfaces covered by unburned and scorched forest vegetation, unlike in ImgEnds, where the %ch abundance appears equal to that of the under-cover bare soil. This was predictable, considering the similarity of the EM3 and EM1 spectral signatures (Figure 4). FI-2, and hence the %sc component, is correctly marked on the forest vegetation in ImgEnds, while it appears slightly confused with the soil in LibrEnds. The green component appeared moderately distributed even among the scorched forest cover in LibrEnds, while it was concentrated in specific zones in ImgEnds.
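The grayscale normalization and false-color compositing described here can be illustrated with a minimal Python sketch (synthetic abundance maps of arbitrary size; the real FIs come from FCLS unmixing of the Sentinel-2 scenes, and the channel assignment simply follows the excerpt):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical abundance maps (fraction images) standing in for FCLS output.
rng = np.random.default_rng(1)
fi_ch = rng.random((100, 100))   # FI-1, %ch (char)
fi_sc = rng.random((100, 100))   # FI-2, %sc (scorched vegetation)
fi_gr = rng.random((100, 100))   # FI-3, %gr (green vegetation)

def normalize(fi):
    """Rescale an abundance map to 0 (black) .. 1 (white) for display."""
    return (fi - fi.min()) / (fi.max() - fi.min() + 1e-12)

# Grayscale view of one fraction image: the brighter the pixel, the higher
# the abundance of that endmember.
plt.figure()
plt.imshow(normalize(fi_ch), cmap="gray", vmin=0, vmax=1)
plt.title("FI-1 (%ch) abundance, grayscale")

# False-color composite combining three FIs, one per RGB channel
# (R = %sc, B = %ch as in the excerpt; G shown here as %gr for illustration).
rgb = np.dstack([normalize(fi_sc), normalize(fi_gr), normalize(fi_ch)])
plt.figure()
plt.imshow(rgb)
plt.title("RGB false-color composite of fraction images")
plt.show()
```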