One Factor Designs
Published in Design of Experiments, 2018
Thomas J. Lorenzen, Virgil L. Anderson
The total sum of squares is the sum over all the observations of the square of each individual deviation from the overall mean. That is, take each individual observation, subtract the grand mean, square the result, and then sum over all observations.
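The recipe above (subtract the grand mean, square, sum) can be sketched directly; this is a minimal illustration in Python with invented data values:

```python
# Total sum of squares: for each observation, subtract the grand mean,
# square the deviation, and sum over all observations.
# The data values below are made up for illustration.

def total_sum_of_squares(observations):
    """Sum of squared deviations of each observation from the grand mean."""
    grand_mean = sum(observations) / len(observations)
    return sum((y - grand_mean) ** 2 for y in observations)

data = [4.0, 6.0, 8.0, 10.0]      # grand mean = 7.0
tss = total_sum_of_squares(data)  # (-3)^2 + (-1)^2 + 1^2 + 3^2 = 20.0
```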
Investigating the OSL components and the sensitivity changes in a natural quartz crystal showing an intense post-sensitization TL signal
Published in Radiation Effects and Defects in Solids, 2023
Bruno R. Soares, Fania D. Caicedo Mateus, Viviane K. Asfora, Pedro L. Guzzo
Data evaluation and plotting were carried out using a self-written R script (37) and the numerical functions of the package Luminescence (38,39). Essentially, the analysis of the OSL data consisted of deconvolving the CW-OSL signals with first-order exponential (i.e. no re-trapping) equations to calculate the photoionization cross-sections. The mathematical transformation of the CW-OSL signal to a pseudo linearly modulated (pLM) signal, and its deconvolution, were also carried out to check the results obtained from the CW-OSL curves. The reliability of the fitting was assessed using the pseudo-R2 parameter from the fit_CWCurve and fit_LMCurve functions. This parameter is defined as 1 − RSS/TSS, where RSS is the residual sum of squares and TSS is the total sum of squares.
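The pseudo-R2 defined above (1 − RSS/TSS) is easy to compute once observed and fitted signals are available. A minimal sketch with a synthetic, CW-OSL-like decay; the decay parameters and the small offset are invented for illustration, not taken from the study:

```python
# Pseudo-R^2 as defined in the text: 1 - RSS/TSS, where RSS is the
# residual sum of squares of the fit and TSS the total sum of squares
# of the observed signal.
import math

def pseudo_r2(observed, fitted):
    mean_obs = sum(observed) / len(observed)
    rss = sum((y - f) ** 2 for y, f in zip(observed, fitted))
    tss = sum((y - mean_obs) ** 2 for y in observed)
    return 1.0 - rss / tss

# Synthetic first-order CW-OSL curve I(t) = I0 * exp(-b*t), plus a
# small constant offset the model does not capture (invented values).
t = [0.1 * i for i in range(50)]
observed = [100.0 * math.exp(-0.8 * ti) + 0.5 for ti in t]
fitted = [100.0 * math.exp(-0.8 * ti) for ti in t]
quality = pseudo_r2(observed, fitted)  # close to, but below, 1
```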
Thermal decomposition kinetics of synthesized poly(N-isopropylacrylamide) and Fe3O4 coated nanocomposite: Evaluation of calculated activation energy by RSM
Published in Petroleum Science and Technology, 2023
Ersin Pekdemir, Ercan Aydoğmuş, Hasan Arslanoğlu
Root mean square error (RMSE) is the standard deviation of the estimation errors. These errors measure how far the data points fall from the regression line, and RMSE aggregates them into a single value showing how tightly the data cluster around the fitted line. The chi-square statistic is commonly used in tests of independence on a crosstab (bivariate table), which presents the joint distribution of two variables simultaneously. The independence test evaluates the pattern observed in the cells to determine whether there is a relationship between the two variables. The residual sum of squares (SSR) is a statistic that measures the amount of variance in a data set that is not explained by the regression model; also known as the sum of squared residuals, it essentially indicates how well a regression model describes or represents the data. In statistics, the total sum of squares (SST) is defined as the sum of the explained sum of squares (SSE) and the residual sum of squares (SSR), i.e. SST = SSE + SSR. Consequently, it is a measure of the discrepancy between the data and the forecast model, which can be used as an optimality criterion in parameter selection and model selection (Azari et al. 2015; Phogat et al. 2016; Azari et al. 2019; Asdagh et al. 2021; Azari et al. 2021; Demirpolat, Aydoğmuş, and Arslanoğlu 2022).
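For ordinary least squares with an intercept, the decomposition of the total sum of squares into an explained part and a residual part holds exactly. A minimal sketch with a hand-computed simple linear regression on invented data:

```python
# Verify SST = SSE (explained) + SSR (residual) for ordinary least
# squares with an intercept. Data values are made up for illustration.

def ols_simple(x, y):
    """Slope and intercept of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = ols_simple(x, y)
pred = [a + b * xi for xi in x]
my = sum(y) / len(y)

sst = sum((yi - my) ** 2 for yi in y)                  # total
sse = sum((pi - my) ** 2 for pi in pred)               # explained by the line
ssr = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))   # residual
# sst equals sse + ssr up to floating-point rounding
```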
Effect of MIST conditioning on the air voids and permeability of hot asphalt mixes containing reclaimed asphalt pavement
Published in Road Materials and Pavement Design, 2021
Burhan Showkat, Dharamveer Singh
Exponential, power-law and linear models were fitted to the observed experimental results for the control and RAP mixes. The goodness of fit of each model was judged by the magnitude of R-squared (R2), which was computed as follows. Given a dataset of n observed values y_1, …, y_n with mean ȳ, and n values f_1, …, f_n predicted by the model, the residuals are e_i = y_i − f_i. Furthermore, the total sum of squares is SS_tot = Σ_i (y_i − ȳ)², and the residual sum of squares is SS_res = Σ_i (y_i − f_i)² = Σ_i e_i². Finally, R2 = 1 − SS_res/SS_tot.
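The R2 computation described above, used to rank competing model forms, can be sketched as follows. The data and the two candidate models are invented for illustration (the study fitted its models by regression; the coefficients here are simply hand-picked):

```python
# R^2 from residuals: SS_res = sum of squared residuals,
# SS_tot = sum of squared deviations from the mean, R^2 = 1 - SS_res/SS_tot.

def r_squared(y, f):
    ybar = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, f))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Invented observations that grow roughly exponentially
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.1, 8.2, 15.8, 32.5]

linear = [7.0 * xi - 8.5 for xi in x]            # hand-picked linear model
expo = [2.0 * 2.0 ** (xi - 1.0) for xi in x]     # hand-picked exponential model

# The model whose R^2 is closer to 1 fits better:
r2_linear = r_squared(y, linear)
r2_expo = r_squared(y, expo)
```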