Tensor Methods for Clinical Informatics
Published in Kayvan Najarian, Delaram Kahrobaei, Enrique Domínguez, Reza Soroushmehr, Artificial Intelligence in Healthcare and Medicine, 2022
Cristian Minoccheri, Reza Soroushmehr, Jonathan Gryak, Kayvan Najarian
Typically, as with matrix factorizations, additional constraints are imposed on the factor matrices to help determine unique bases for their column spaces. One very relevant example is the Higher-Order Singular Value Decomposition, or HOSVD (De Lathauwer et al., 2000a), where the factor matrices are orthogonal matrices consisting of the leading left singular vectors of the matricizations of the tensor, yielding a core tensor whose size is the multilinear rank of the tensor. The truncated HOSVD is not the optimal approximation in terms of the Frobenius norm, but it is a good generalization of the Singular Value Decomposition for matrices (in fact, the HOSVD of a second-order tensor reduces to the SVD). For example, it can be used to obtain a low multilinear rank approximation of a tensor by truncating, from the factor matrices, singular vectors with small singular values, which leads to a smaller core.
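As an illustration of the truncation just described, here is a minimal NumPy sketch of the (truncated) HOSVD; the function names (`unfold`, `truncated_hosvd`, `tucker_to_tensor`) are my own, not from the chapter:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: mode-n fibers become the columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    """Mode-n product T x_n M: multiply mode-n fibers of T by M."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def truncated_hosvd(T, ranks):
    """Factor matrices from the leading left singular vectors of each
    mode unfolding; core obtained by projecting T onto those bases."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        core = mode_product(core, U.T, mode)
    return core, factors

def tucker_to_tensor(core, factors):
    """Reassemble the (approximate) tensor from core and factors."""
    T = core
    for mode, U in enumerate(factors):
        T = mode_product(T, U, mode)
    return T
```

With full ranks the reconstruction is exact; truncating to smaller ranks yields a low multilinear rank approximation with a correspondingly smaller core.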
Big Data over Wireless Networks (WiBi)
Published in Yulei Wu, Fei Hu, Geyong Min, Albert Y. Zomaya, Big Data and Computational Intelligence in Networking, 2017
The above algorithm needs to be repeated for every change in the tensor data, such as the streaming updates illustrated in Figure 4.11a. A method to handle this scenario is given in [1], where a recursive implementation of HOSVD is proposed. Other recent methods for finding the latent subspace of a matrix could also be adapted for use with tensors. Many matrix decomposition algorithms could likewise be extended to tensors, such as the Grassmannian robust adaptive subspace tracking algorithm (GRASTA), which is highly efficient with streaming data.
Image-Based Prognostics Using Penalized Tensor Regression
Published in Technometrics, 2019
Xiaolei Fang, Kamran Paynabar, Nagi Gebraeel
Using BIC for rank selection in the Tucker-based tensor regression model can be computationally prohibitive. For example, for a third-order tensor there are 3³ = 27 rank candidates when the maximum rank in each dimension is 3; increasing the maximum rank to 4 and 5 raises the number of rank candidates to 4³ = 64 and 5³ = 125, respectively. To address this challenge, we propose a computationally efficient heuristic method that automatically selects an appropriate rank. First, an initial coefficient tensor is estimated by regressing each pixel against the TTF. Next, high-order singular value decomposition (HOSVD) (De Lathauwer, De Moor, and Vandewalle 2000) is applied to the estimated tensor. HOSVD works by applying the regular SVD to the matricization of the initial tensor along each mode. The rank of each mode can be selected using the fraction-of-variance-explained (FVE) criterion (Fang, Zhou, and Gebraeel 2015), and the resulting matrix of leading left singular vectors is the factor matrix for that mode. Given the initial tensor and its estimated factor matrices, we can estimate the core tensor. The core tensor and factor matrices estimated by HOSVD are used for initialization in Algorithm 2. As various studies in the literature have pointed out, HOSVD often performs reasonably well as an initialization method for iterative tensor estimation algorithms (Kolda and Bader 2009; Lu, Plataniotis, and Venetsanopoulos 2008).
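The FVE-based rank selection step can be sketched as follows. This is a hedged illustration: the 0.95 threshold and the function names are assumptions for the sketch, not values from the paper.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization of tensor T."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def select_ranks_fve(T, threshold=0.95):
    """For each mode, pick the smallest rank whose leading singular
    values explain at least `threshold` of the variance (FVE)."""
    ranks = []
    for mode in range(T.ndim):
        s = np.linalg.svd(unfold(T, mode), compute_uv=False)
        fve = np.cumsum(s ** 2) / np.sum(s ** 2)
        ranks.append(int(np.searchsorted(fve, threshold) + 1))
    return ranks
```

Applied to the initial coefficient tensor, this yields one rank per mode in a single pass, instead of fitting the regression model once per candidate rank combination.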
Multilinear principal component analysis for statistical modeling of cylindrical surfaces: a case study
Published in Quality Technology & Quantitative Management, 2018
Massimo Pacella, Bianca M. Colosimo
A way to compute the HOSVD of the tensor is through the common SVD procedure applied to the matrix unfoldings (i.e. the SVD of the mode-1, mode-2, and mode-3 unfolding matrices, in order to obtain the corresponding factor matrices). Then, the core tensor can be obtained from Equation (3) by multiplying the tensor along each mode by the transpose of the corresponding factor matrix.
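A minimal NumPy sketch of this computation, assuming the standard mode-n unfolding and mode-product conventions (the variable names are illustrative, not the paper's symbols):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of T."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    """Mode-n product T x_n M."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

T = np.random.rand(3, 4, 5)
# factor matrices: left singular vectors of each mode unfolding
U = [np.linalg.svd(unfold(T, m), full_matrices=False)[0] for m in range(T.ndim)]
# core tensor: multiply T along each mode by the transposed factor matrix
S = T
for m in range(T.ndim):
    S = mode_product(S, U[m].T, m)
# multiplying the core back by the factors recovers T exactly
R = S
for m in range(T.ndim):
    R = mode_product(R, U[m], m)
```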
A high fidelity cost efficient tensorial method based on combined POD-HOSVD reduced order model of flow field
Published in European Journal of Computational Mechanics, 2018
Mohammad Kazem Moayyedi, Milad Najaf beygi
HOSVD is a tensorial form of the standard SVD, which applies only to matrices (Lorente et al., 2008). If the snapshot data set is considered as a high-order tensor, it can be decomposed into factor matrices. In this case, symmetric matrices are formed from the snapshot tensor, and their eigenvectors and eigenvalues are computed. Consequently, it is possible to reconstruct a reduced-order model of the snapshot tensor. The tensor decomposition is expressed by the following equation (in the standard HOSVD form):

\mathcal{A} = \mathcal{S} \times_1 U^{(1)} \times_2 U^{(2)} \times_3 U^{(3)}

where \mathcal{S} is the core tensor and the U^{(n)} are the orthogonal factor matrices.