Delamination detection in composite laminates using auto-regressive models of vibration signals
Published in Alphose Zingoni, Insights and Innovations in Structural Engineering, Mechanics and Computation, 2016
D. Nardi, M. Pasquali, L. Lampani, P. Gaudenzi
Three different scatter matrices are defined: the class-scatter matrix Sc, the within-class scatter matrix Sw, and the between-class scatter matrix Sb.
Fast Robust Location and Scatter Estimation: A Depth-based Method
Published in Technometrics, 2023
Maoyu Zhang, Yan Song, Wenlin Dai
Rousseeuw and Driessen (1999) proposed the first computationally feasible algorithm, termed FASTMCD, for approximating the MCD subset. Specifically, they randomly construct a number of initial subsets and apply two C-steps to each, retaining the 10 subsets with the lowest determinants. The C-step is then iterated on these 10 subsets until the determinant sequence converges, and the MCD subset is chosen as the one yielding the smallest determinant. The computational cost of FASTMCD is therefore roughly proportional to the number of initial subsets. Hubert, Rousseeuw, and Verdonck (2012) proposed an alternative algorithm, DetMCD, which replaces the random initial subsets (of which there may be many) of FASTMCD with six well-designed deterministic estimators of location and scatter, and involves the C-step in a similar way. Denote by (μ̂, Σ̂) the location and scatter matrix estimates of the h-subset for which the determinant of the sample covariance matrix is as small as possible. Further, an additional reweighting step is employed in both algorithms to improve the efficiency of the estimators. To be more specific, the estimators are renewed as the trimmed estimates μ̂_R = Σᵢ Wᵢxᵢ / ΣᵢWᵢ and Σ̂_R = Σᵢ Wᵢ(xᵢ − μ̂_R)(xᵢ − μ̂_R)ᵀ / (ΣᵢWᵢ − 1), where Wᵢ = 1 when d²(xᵢ; μ̂, Σ̂) ≤ χ²_{p,α} and 0 otherwise, χ²_{p,α} is the α-quantile of the χ² distribution with p degrees of freedom, and d²(xᵢ; μ̂, Σ̂) = (xᵢ − μ̂)ᵀΣ̂⁻¹(xᵢ − μ̂) is the squared Mahalanobis distance.
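As a hedged illustration of the reweighted MCD estimator described above, the sketch below uses scikit-learn's MinCovDet, which implements FASTMCD with the reweighting step; the contaminated data set is synthetic and invented for this example:

```python
import numpy as np
from sklearn.covariance import MinCovDet, EmpiricalCovariance

rng = np.random.default_rng(0)

# Synthetic bivariate data: 180 clean points plus a cluster of 20 outliers
clean = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=180)
outliers = rng.multivariate_normal([8, 8], np.eye(2), size=20)
X = np.vstack([clean, outliers])

# FASTMCD: initial subsets + C-steps + reweighting (Rousseeuw & Driessen, 1999)
mcd = MinCovDet(support_fraction=0.75, random_state=0).fit(X)

# The classical (non-robust) estimate is pulled toward the outliers
mle = EmpiricalCovariance().fit(X)
print(mcd.location_)  # robust location, near the true center [0, 0]
print(mle.location_)  # classical mean, shifted toward the outlier cluster
```

The robust location stays close to the bulk of the data, while the classical mean is dragged roughly 10% of the way toward the outlier cluster.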
Gender and Age Classification Enabled Blockchain Security Mechanism for Assisting Mobile Application
Published in IETE Journal of Research, 2021
Sapna Juneja, Sourav Jain, Aparna Suneja, Gurminder Kaur, Yasser Alharbi, Ali Alferaidi, Abdullah Alharbi, Wattana Viriyasitavat, Gaurav Dhiman
Suppose there is a two-dimensional dataset with two attributes, X1 and X2. Class C1 contains the points {(4,1), (2,4), (2,3), (3,6), (4,4)} and class C2 the points {(5,6), (7,9), (5,7), (6,8), (9,10)}. The following steps are applied to this dataset. First, find the within-class scatter matrix Sw = S1 + S2, where S1 and S2 are the covariance matrices of classes C1 and C2, respectively. The covariance matrix is computed from the outer products (x − mean(C1))(x − mean(C1))ᵀ, where mean(C1) is the mean of class C1 and x belongs to C1. Since there are five data points per class, Equation (10) yields five such matrices; their average is S1. S2 is found in the same way, and the two are summed. Next, calculate the between-class scatter matrix Sb = (mean(C1) − mean(C2))(mean(C1) − mean(C2))ᵀ. Then find the best projection vector W: compute the eigenvectors of Sw⁻¹Sb and take the eigenvector corresponding to the largest eigenvalue, thereby reducing the dimensionality. All data points are projected onto W via y = Wᵀx, where x is the input data; the projection carries the features needed to build an ML application, and the separation between the classes is maximised when the samples are projected onto W.
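The worked example above can be sketched in plain NumPy; following the text, each per-class covariance is the average of the five outer products, and for two classes the best projection vector reduces to W ∝ Sw⁻¹(m1 − m2):

```python
import numpy as np

C1 = np.array([(4, 1), (2, 4), (2, 3), (3, 6), (4, 4)], dtype=float)
C2 = np.array([(5, 6), (7, 9), (5, 7), (6, 8), (9, 10)], dtype=float)

m1, m2 = C1.mean(axis=0), C2.mean(axis=0)  # class means

# Within-class scatter: average the outer products of deviations, per class
S1 = sum(np.outer(x - m1, x - m1) for x in C1) / len(C1)
S2 = sum(np.outer(x - m2, x - m2) for x in C2) / len(C2)
Sw = S1 + S2

# Between-class scatter for two classes
Sb = np.outer(m1 - m2, m1 - m2)

# Largest-eigenvalue eigenvector of Sw^{-1} Sb; for two classes this is
# proportional to Sw^{-1} (m1 - m2), so solve the linear system directly
W = np.linalg.solve(Sw, m1 - m2)

# Project all samples onto W: y = W^T x
y1, y2 = C1 @ W, C2 @ W
print(Sw)  # [[3.04 1.6 ], [1.6  4.64]]
```

On this data the two classes are completely separated along the projection, which is exactly the property the projection vector is chosen to maximise.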
Multilinear principal component analysis for statistical modeling of cylindrical surfaces: a case study
Published in Quality Technology & Quantitative Management, 2018
Massimo Pacella, Bianca M. Colosimo
Consider a data set of N one-dimensional arrays of P variables, summarized in a P × N matrix X (the rows represent the variables and the columns are the samples). Let X̃ represent the centered data matrix obtained from X by subtracting the mean data sample from each sample. The covariance matrix of X̃ is X̃X̃ᵀ/(N − 1), obtained by re-scaling by a constant the scatter matrix of X̃, defined as X̃X̃ᵀ. The Singular Value Decomposition (SVD) of X̃ is X̃ = UΣVᵀ. U is a unitary matrix (i.e. its columns form an orthonormal basis of ℝᴾ), whose columns are the eigenvectors of the scatter matrix X̃X̃ᵀ (and of the covariance matrix). These eigenvectors are called the left singular vectors of X̃. Σ is pseudodiagonal and contains the singular values of X̃. Without loss of generality, the elements in Σ are assumed arranged in decreasing order (columns of U and V are correspondingly arranged). V is a unitary matrix, whose columns are the eigenvectors of the scatter matrix X̃ᵀX̃, also called the right singular vectors of X̃.
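These relations can be checked numerically; the sketch below uses synthetic data with variables in rows and samples in columns, as in the text:

```python
import numpy as np

rng = np.random.default_rng(0)
P, N = 4, 30
X = rng.normal(size=(P, N))             # P variables (rows), N samples (columns)
Xc = X - X.mean(axis=1, keepdims=True)  # subtract the mean sample from each column

# SVD of the centered matrix: Xc = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scatter = Xc @ Xc.T          # P x P scatter matrix
cov = scatter / (N - 1)      # covariance = re-scaled scatter

# Columns of U are eigenvectors of the scatter matrix, eigenvalues s**2
print(np.allclose(scatter @ U, U * s**2))             # True
# Singular values come out in decreasing order
print(np.all(np.diff(s) <= 0))                        # True
# Columns of V (rows of Vt) are eigenvectors of Xc^T Xc
print(np.allclose((Xc.T @ Xc) @ Vt.T, Vt.T * s**2))   # True
```

The same columns of U are eigenvectors of the covariance matrix too, with eigenvalues s**2/(N − 1), since the two matrices differ only by a constant factor.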