Linear Algebra for Machine Learning
Published in P. Kaliraj, T. Devi, Artificial Intelligence Theory, Models, and Applications, 2021
Today, ML algorithms have become an integral part of various industries, including business, finance, and healthcare. Linear algebra helps develop a deeper understanding of machine learning projects that involve data with natural graphical interpretations, such as images, audio, video, and edge detection. It is used to build better supervised as well as unsupervised machine learning algorithms. Machine learning classifiers are trained on part of a given dataset according to its categories; another task of classifiers is to remove errors from the data on which they have been trained. It is at this stage that linear algebra comes in to help process such complex and large datasets, using matrix decomposition techniques like QR and LU decomposition. Length-squared sampling of matrices, singular value decomposition (SVD), and low-rank approximation are a few other techniques widely used in data processing. Logistic regression, linear regression, decision trees, and support vector machines (SVM) are a few supervised learning algorithms that can be built from scratch with the help of linear algebra. SVD is typically used in principal component analysis (PCA), which in turn is widely used for feature extraction and for assessing how significant the relationship between the features and an outcome is. PCA is widely used in computer vision and image compression, since it reduces both storage space and computation time.
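The SVD-PCA connection mentioned above can be sketched in a few lines of Python/NumPy. This example is illustrative only (the function name and data are not from the excerpted works): it centers a data matrix and projects it onto its top-k principal components using the right singular vectors.

```python
import numpy as np

def pca_via_svd(X, k):
    """Project data X (n_samples x n_features) onto its top-k
    principal components, computed via SVD (illustrative sketch)."""
    Xc = X - X.mean(axis=0)                    # center each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                       # scores in the top-k subspace

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca_via_svd(X, 2)
print(Z.shape)   # (100, 2)
```

Keeping only the first k columns is what yields the storage and computation savings the passage attributes to PCA-based image compression.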
Solving systems of algebraic equations
Published in Victor A. Bloomfield, Using R for Numerical Analysis in Science and Engineering, 2018
The singular value decomposition function svd() in the base installation decomposes a rectangular matrix into the product U D V^H, where D is a nonnegative diagonal matrix, U and V are unitary matrices, and V^H denotes the conjugate transpose of V (or simply the transpose if V contains real numbers only). The singular values are the diagonal elements of D. For symmetric nonnegative-definite square matrices, svd() and eigen() give equivalent eigenvalues. In fact, the routines that R uses to calculate eigenvalues and eigenvectors, based on LAPACK and its predecessor EISPACK, share their underlying machinery with SVD calculations.
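Although the passage describes R's svd() and eigen(), the same relationships can be verified with NumPy's analogous routines (an assumption for illustration; the corresponding R calls behave the same way on real matrices):

```python
import numpy as np

# The decomposition A = U D V^T (real case): A is exactly reconstructed
# from its singular factors.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
U, d, Vt = np.linalg.svd(A)
print(np.allclose(U @ np.diag(d) @ Vt, A))   # True

# For a symmetric positive semi-definite matrix, the singular values
# coincide with the eigenvalues (compare svd() and eigen() in R).
S = A @ A.T
eigs_desc = np.sort(np.linalg.eigvalsh(S))[::-1]
print(np.allclose(np.linalg.svd(S)[1], eigs_desc))  # True
```

The second check is the special case in which the SVD and the eigendecomposition agree; for a general square matrix the singular values and eigenvalues differ.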
Singular Value Decomposition–Principal Component Analysis-Based Object Recognition Approach
Published in D. P. Acharjya, V. Santhi, Bio-Inspired Computing for Image and Video Processing, 2018
Chiranji Lal Chowdhary, D.P. Acharjya
Singular value decomposition (SVD) [21] is often used to solve the eigen problem for the covariance matrix. It is also used in robust statistics to solve over-determined systems of equations. As the underlying data represent gray-value images, the discussion is restricted to real-valued matrices. The eigendecomposition itself, however, applies only to square, non-singular matrices, whereas SVD applies to any matrix. Most modern SVD algorithms are based on the method of Golub and Reinsch [10]. SVD is one of the most important tools of numerical signal processing, and is employed in a variety of system and image processing applications, such as spectrum analysis, filter design, system identification, and object recognition.
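The use of SVD to solve over-determined systems mentioned above can be sketched as follows (an illustrative example, not from the chapter): the least-squares solution of A x ≈ b is obtained by applying the pseudo-inverse built from the singular factors.

```python
import numpy as np

# Over-determined system: 3 equations, 2 unknowns, no exact solution.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Least-squares solution x = V diag(1/s) U^T b  (i.e., x = A^+ b).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((U.T @ b) / s)

# Agrees with the library least-squares solver.
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```

In robust variants, small singular values are additionally truncated before inverting, which stabilizes the solution against noise.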
Hyperspectral anomaly detection: a performance comparison of existing techniques
Published in International Journal of Digital Earth, 2022
Noman Raza Shah, Abdur Rahman M. Maud, Farrukh Aziz Bhatti, Muhammad Khizer Ali, Khurram Khurshid, Moazam Maqsood, Muhammad Amin
Another commonly used dimensionality reduction technique is SVD. It applies a linear transformation that converts the original variables into independent components. Unlike PCA, this algorithm does not center the data before computing the singular value decomposition; consequently, it can deal efficiently with sparse matrices. In our simulations, truncated SVD (Hansen 1990) (also known as latent semantic analysis) is applied to different hyperspectral datasets, with the number of components of interest restricted. The number of components of truncated SVD varies from 2 to 179 in our simulation. Figure 19 shows the effect of the number of components on the RX detector's AUC. For centered data, the singular vectors in SVD are the same as the eigenvectors in PCA; similarly, the eigenvalues produced by PCA are (up to a normalization factor) the squares of the singular values in SVD. By using SVD, the least informative bands are removed. Figure 19 shows that, across the various datasets, most of the information is concentrated in the first few components.
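The stated relationship between SVD and PCA can be checked numerically. The sketch below (illustrative data, not the hyperspectral datasets of the paper) centers the data for the comparison — truncated SVD itself is typically applied without centering, as the passage notes — and verifies that the PCA covariance eigenvalues equal the squared singular values divided by (n − 1):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 10))
Xc = X - X.mean(axis=0)                      # centering, as PCA does

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
X_reduced = Xc @ Vt[:k].T                    # truncated-SVD projection
print(X_reduced.shape)                       # (50, 3)

# PCA eigenvalues of the sample covariance are s^2 / (n - 1).
cov_eigvals = np.sort(np.linalg.eigvalsh(np.cov(Xc, rowvar=False)))[::-1]
print(np.allclose(cov_eigvals, s**2 / (X.shape[0] - 1)))  # True
```

Discarding all but the first few components is exactly the band-reduction step described for the RX detector experiments.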
Deceptive Infusion of Data: A Novel Data Masking Paradigm for High-Valued Systems
Published in Nuclear Science and Engineering, 2022
Arvind Sundaram, Hany Abdel-Khalik, Ahmad Al Rashdan
In Secs. IV.A and IV.B, the extraction of inference metadata and their separability was demonstrated using statistical and AI/ML tools. In this section, we use SVD to extract information about both the fundamental and inference metadata from the PWR, DCPM, and DIOD data. SVD is a widely used analysis tool, defined via an orthogonal transformation of the data onto a new set of coordinate axes ordered from greatest to least variance. The given dataset is decomposed as shown in Eq. (9), where the singular vectors describe the dominant features of the dataset and the corresponding coefficients describe their respective weights.
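Eq. (9) is not reproduced in this excerpt; the conventional form of the decomposition it refers to is the standard SVD expansion (written here as an assumption from context):

```latex
% Standard SVD expansion of a data matrix A of rank r:
A \;=\; U \Sigma V^{\mathsf{T}} \;=\; \sum_{i=1}^{r} \sigma_i \, u_i v_i^{\mathsf{T}},
% where the left singular vectors u_i capture the dominant features,
% the terms \sigma_i v_i^{\mathsf{T}} give their coefficients, and the
% singular values \sigma_i are ordered from greatest to least.
```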
Parameter tracking of time-varying Hammerstein-Wiener systems
Published in International Journal of Systems Science, 2021
As an online algorithm, its computational complexity must be considered. The Ave method requires less computation and is easy to combine with the Kalman filter (KF) for online application. The time complexity of SVD is O(n³), where n is the larger of the number of rows and columns of the matrix; if n is large, SVD consumes substantial computation. For the H-W systems we studied, most of the linear dynamic characteristics can be described by the model shown in Equation (1) with order less than 5, which means r and m will not be too large. In addition, only a few basis functions are required to fit the nonlinear characteristics if the system operates stably, so the corresponding orders will not be too high. SVD can therefore also be combined with the KF to realise online identification in practical applications.
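For the small orders discussed (below 5), the matrices handed to SVD stay tiny, so the cubic cost is negligible in practice. The sketch below is illustrative only (the dimensions and the rank-1 truncation step are assumptions, not taken from this paper):

```python
import numpy as np

# Small illustrative parameter matrix: with orders r = 4, m = 3 the
# O(n^3) SVD cost is trivial.
r, m = 4, 3
Theta = np.random.default_rng(2).normal(size=(r, m))

U, s, Vt = np.linalg.svd(Theta, full_matrices=False)
print(U.shape, s.shape, Vt.shape)            # (4, 3) (3,) (3, 3)

# Best rank-1 approximation, a common way to separate coupled
# parameters in identification problems (illustrative assumption).
Theta1 = s[0] * np.outer(U[:, 0], Vt[0])
print(Theta1.shape)                          # (4, 3)
```

Because each such decomposition involves only a handful of rows and columns, repeating it at every sampling instant alongside a KF update remains feasible online.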