Tensor Methods for Clinical Informatics
Published in Kayvan Najarian, Delaram Kahrobaei, Enrique Domínguez, Reza Soroushmehr, Artificial Intelligence in Healthcare and Medicine, 2022
Cristian Minoccheri, Reza Soroushmehr, Jonathan Gryak, Kayvan Najarian
Tensor decomposition and tensor completion methods are well-established techniques that can be used either on their own or as a preprocessing tool in combination with classical machine learning methods. Many of them are more powerful extensions of matrix methods, and formulating problems in tensorial form often leads to better-behaved solutions. Compared to other machine learning techniques, tensor methods tend to be more easily interpretable. Furthermore, they often make it possible to combine heterogeneous datasets efficiently, which is especially relevant in the current age of big data. Within clinical informatics, these techniques have been successfully applied to several types of data (EEG, ECG, and EHR data), resulting in a rich body of research that continues to attract attention as improvements are achieved.
Advanced Methods
Published in Atsushi Kawaguchi, Multivariate Analysis for Neuroimaging Data, 2021
Since multimodal data consist of multiple data matrices, they can be combined into an array and regarded as a data tensor, to which tensor decomposition can be applied. There are two main types of tensor decomposition, CP (canonical polyadic) and Tucker, of which Tucker is the more general. Let $\underline{X}$ be an $n \times N \times M$ data tensor. Each axis of the tensor is called a mode and its length is called a dimension. Here, mode 1 represents the $n$-dimensional subject index, mode 2 the $N$-dimensional spatial domain (voxels), and mode 3 the $M$-dimensional modality. The Tucker decomposition is then given by
$$\underline{X} = \underline{C} \times_1 S \times_2 W_1 \times_3 W_2,$$
where the $K_1 \times K_2 \times K_3$ tensor $\underline{C}$ is called the core tensor. Each dimension of the core tensor can be viewed as the number of components for the corresponding mode of the data tensor; in this way, Tucker allows the number of components to differ across modes. $S$ is the $n \times K_1$ score matrix, $W_1$ is the $N \times K_2$ weight matrix for the spatial domain, and $W_2$ is the $M \times K_3$ weight matrix for the modalities. $\times_l$ denotes the mode-$l$ product (summation over mode $l$). The $ijm$-component is
$$x_{ijm} = \sum_{k_1=1}^{K_1} \sum_{k_2=1}^{K_2} \sum_{k_3=1}^{K_3} c_{k_1 k_2 k_3}\, s_{i k_1}\, w^{(1)}_{j k_2}\, w^{(2)}_{m k_3}.$$
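To make the mode products above concrete, here is a minimal sketch (assuming only NumPy; the dimensions and variable names are illustrative choices, not taken from the chapter) that builds a random core tensor and factor matrices and checks that successive mode-$l$ products reproduce the element-wise formula:

```python
import numpy as np

# Dimensions of the data tensor and core tensor (illustrative values)
n, N, M = 5, 8, 3       # subjects, voxels, modalities
K1, K2, K3 = 2, 4, 2    # number of components per mode

rng = np.random.default_rng(0)
C = rng.standard_normal((K1, K2, K3))   # core tensor
S = rng.standard_normal((n, K1))        # score matrix (mode 1)
W1 = rng.standard_normal((N, K2))       # spatial weight matrix (mode 2)
W2 = rng.standard_normal((M, K3))       # modality weight matrix (mode 3)

# Tucker reconstruction in one contraction, matching the ijm-component:
# x_{ijm} = sum_{k1,k2,k3} c_{k1 k2 k3} s_{i k1} w1_{j k2} w2_{m k3}
X = np.einsum('abc,ia,jb,mc->ijm', C, S, W1, W2)

def mode_product(T, A, mode):
    """Multiply tensor T by matrix A along the given mode (sums over that mode)."""
    T = np.moveaxis(T, mode, 0)
    out = np.tensordot(A, T, axes=(1, 0))
    return np.moveaxis(out, 0, mode)

# The same tensor computed as C x_1 S x_2 W1 x_3 W2, one mode at a time
X_step = mode_product(mode_product(mode_product(C, S, 0), W1, 1), W2, 2)
assert np.allclose(X, X_step)
print(X.shape)  # (5, 8, 3): n x N x M
```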
Image inpainting via Smooth Tucker decomposition and Low-rank Hankel constraint
Published in International Journal of Computers and Applications, 2023
Jing Cai, Jiawei Jiang, Yibing Wang, Jianwei Zheng, Honghui Xu
For standard tensor decomposition problems, Tucker can be extended to handle corrupted data by decomposing the tensor into a core factor multiplied by several matrices along each mode. However, at high missing rates the prior properties of real-world visual images can only be captured approximately. We therefore introduce a Hankelization step to uncover the intrinsic low-rank structure. Specifically, an $N$th-order Hankel-structured tensor has the following structure [30]:
$$\mathcal{X}_H = \mathcal{H}(\mathcal{X}),$$
where $\mathcal{H}(\cdot)$ is the operation that turns the input $N$th-order tensor into a $2N$th-order one, and the duplication matrix, which is composed of identity matrices, is shown in Figure 1.
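The intuition behind Hankelization is easiest to see in the first-order case, where a 1-D signal is embedded into a Hankel matrix whose anti-diagonals duplicate the signal entries; a smooth signal then yields a numerically low-rank matrix. The sketch below is a minimal illustration of that idea (assuming NumPy and SciPy; the signal and the window length `tau` are our choices, not the paper's construction):

```python
import numpy as np
from scipy.linalg import hankel

# A smooth 1-D signal: a single sinusoid sampled at 64 points
t = np.linspace(0, 2 * np.pi, 64)
x = np.sin(3 * t)

# Hankelize: embed the length-64 signal into a tau x (64 - tau + 1)
# Hankel matrix whose entries H[i, j] = x[i + j] duplicate signal values
tau = 16
H = hankel(x[:tau], x[tau - 1:])

# A smooth signal yields a low-rank Hankel matrix: a pure sinusoid
# has Hankel rank 2, so the singular values drop off sharply.
sv = np.linalg.svd(H, compute_uv=False)
print(np.round(sv[:4], 3))  # first two dominate; the rest are ~0
```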
An Adaptive Sampling Strategy for Online Monitoring and Diagnosis of High-Dimensional Streaming Data
Published in Technometrics, 2022
Ana María Estrada Gómez, Dan Li, Kamran Paynabar
The CANDECOMP/PARAFAC (CP) (Kiers 2000) tensor decomposition factorizes a tensor into a sum of rank-one tensors. For example, given a third-order tensor $\mathcal{X} \in \mathbb{R}^{I \times J \times K}$, its CP decomposition is
$$\mathcal{X} \approx \sum_{r=1}^{R} \mathbf{a}_r \circ \mathbf{b}_r \circ \mathbf{c}_r,$$
where $\circ$ represents the vector outer product, $R$ is a positive integer approximating the rank of the tensor, and $\mathbf{a}_r \in \mathbb{R}^{I}$, $\mathbf{b}_r \in \mathbb{R}^{J}$, and $\mathbf{c}_r \in \mathbb{R}^{K}$ for $r = 1, \dots, R$. We define the factor matrices as the combination of the vectors from the rank-one components, $\mathbf{A} = [\mathbf{a}_1, \dots, \mathbf{a}_R]$, $\mathbf{B} = [\mathbf{b}_1, \dots, \mathbf{b}_R]$, and $\mathbf{C} = [\mathbf{c}_1, \dots, \mathbf{c}_R]$, and write $\mathcal{X} \approx [\![\mathbf{A}, \mathbf{B}, \mathbf{C}]\!]$. The rank of a tensor $\mathcal{X}$ is defined as the smallest number of rank-one tensors whose sum is exactly equal to $\mathcal{X}$. See Kolda and Bader (2009) for a comprehensive overview of higher-order tensor decomposition and its applications.
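As a concrete check of this notation, the following minimal sketch (NumPy only; all dimensions and names are illustrative) forms a rank-$R$ tensor from factor matrices $\mathbf{A}$, $\mathbf{B}$, $\mathbf{C}$ and verifies that the explicit sum of outer products equals a single contraction:

```python
import numpy as np

I, J, K, R = 4, 5, 6, 3   # tensor dimensions and CP rank (illustrative)
rng = np.random.default_rng(1)
A = rng.standard_normal((I, R))   # factor matrix, columns a_r
B = rng.standard_normal((J, R))   # factor matrix, columns b_r
C = rng.standard_normal((K, R))   # factor matrix, columns c_r

# CP reconstruction as an explicit sum of R rank-one (outer-product) tensors
X_sum = sum(np.multiply.outer(np.multiply.outer(A[:, r], B[:, r]), C[:, r])
            for r in range(R))

# The same tensor [[A, B, C]] in one step: x_{ijk} = sum_r a_{ir} b_{jr} c_{kr}
X = np.einsum('ir,jr,kr->ijk', A, B, C)

assert np.allclose(X, X_sum)
print(X.shape)  # (4, 5, 6)
```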
Discussion of “A novel approach to the analysis of spatial and functional data over complex domains”
Published in Quality Engineering, 2020
Tensor decomposition is one of the most popular topics in tensor analysis, and can be considered a high-dimensional version of the matrix singular value decomposition. Two specific decomposition forms are usually adopted in tensor analysis. For a tensor $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$, CANDECOMP/PARAFAC (CP) decomposes the tensor into a sum of rank-one tensors, as shown in Eq. (1):
$$\mathcal{X} \approx \sum_{r=1}^{R} \lambda_r\, \mathbf{a}^{(1)}_r \circ \cdots \circ \mathbf{a}^{(N)}_r, \qquad (1)$$
where each $\mathbf{a}^{(n)}_r$ is a unit vector and $\circ$ is the outer product. $\mathbf{A}^{(n)} = [\mathbf{a}^{(n)}_1, \dots, \mathbf{a}^{(n)}_R]$ is the mode-$n$ factor matrix of dimension $I_n \times R$, and $R$ is the rank.
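In practice the weights $\lambda_r$ and unit-norm factors are estimated numerically. Here is a minimal sketch assuming the open-source TensorLy library (tensorly.org); the `parafac` routine with its `normalize_factors` option is used to recover the weighted unit-vector form of Eq. (1), and the synthetic data and rank choice are ours:

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Build a synthetic third-order tensor of exact CP rank 2
rng = np.random.default_rng(2)
factors_true = [rng.standard_normal((dim, 2)) for dim in (10, 12, 14)]
X = tl.tensor(np.einsum('ir,jr,kr->ijk', *factors_true))

# Fit a rank-2 CP model; normalize_factors=True returns unit-norm factor
# columns a_r^{(n)} together with the scalar weights lambda_r, as in Eq. (1)
cp = parafac(X, rank=2, normalize_factors=True)
weights, factors = cp

print(weights)                                     # the lambda_r scalars
print([np.linalg.norm(f[:, 0]) for f in factors])  # each column has unit norm
print(np.linalg.norm(tl.cp_to_tensor(cp) - X))     # near-zero reconstruction error
```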