A Self-Organizing Map-Based Spectral Clustering on Shortest Path of a Graph
Published in Siddhartha Bhattacharyya, Anirban Mukherjee, Indrajit Pan, Paramartha Dutta, Arup Kumar Bhaumik, Hybrid Intelligent Techniques for Pattern Analysis and Understanding, 2017
Parthajit Roy, Swati Adhikari, J. K. Mandal
Spectral graph clustering is currently one of the most promising areas in the field of graph clustering. These methods are widely used to cluster very large datasets and often achieve better performance than other data clustering methods. Spectral graph theory is mainly concerned with the eigenvalues and eigenvectors of the adjacency matrix and the Laplacian matrix of a graph, and the spectral properties of these two matrices are used for graph partitioning. In particular, the Laplacian spectrum of a graph yields a minimum cut when it is used for graph partitioning.
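The partitioning idea above can be sketched in a few lines of NumPy. This is a minimal illustration on a hypothetical 6-node graph (not the chapter's SOM-based method): it builds the Laplacian and bipartitions the graph by the sign of the second-smallest eigenvector (the Fiedler vector), which approximates the minimum cut.

```python
import numpy as np

# A hypothetical 6-node graph: two triangles {0,1,2} and {3,4,5}
# joined by a single bridge edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n))            # adjacency matrix
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

D = np.diag(A.sum(axis=1))      # degree matrix
L = D - A                       # unnormalized graph Laplacian

# Symmetric eigendecomposition; eigenvalues come back in ascending order.
vals, vecs = np.linalg.eigh(L)

# The second-smallest eigenvector (Fiedler vector) approximates the
# minimum cut: its sign pattern bipartitions the graph, cutting the
# single bridge edge between the two triangles.
fiedler = vecs[:, 1]
labels = (fiedler > 0).astype(int)   # one triangle gets 0, the other 1
```

The smallest eigenvalue of a connected graph's Laplacian is 0, and the sign of the Fiedler vector separates the two densely connected triangles across the bridge edge.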
Introduction to graph theory
Published in Karthik Raman, An Introduction to Computational Systems Biology, 2021
The Laplacian has many interesting properties and is very useful for graph clustering. An entire field of study, spectral graph theory, is dedicated to the eigenvalues and eigenvectors of the Laplacian (or even the adjacency) matrix [13].
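Two of those properties are easy to verify numerically: the Laplacian is positive semidefinite, and the multiplicity of its zero eigenvalue equals the number of connected components of the graph. A small sketch with a hypothetical two-component graph:

```python
import numpy as np

# A hypothetical graph with two connected components:
# a path 0-1-2 and an isolated edge 3-4.
A = np.zeros((5, 5))
for i, j in [(0, 1), (1, 2), (3, 4)]:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A     # graph Laplacian L = D - A
vals = np.linalg.eigvalsh(L)       # eigenvalues, ascending

# L is positive semidefinite (all eigenvalues >= 0), and the
# multiplicity of eigenvalue 0 counts the connected components.
num_components = int(np.sum(np.isclose(vals, 0.0)))
print(num_components)              # 2 for this graph
```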
Nyström-based spectral clustering using airborne LiDAR point cloud data for individual tree segmentation
Published in International Journal of Digital Earth, 2021
Yong Pang, Weiwei Wang, Liming Du, Zhongjun Zhang, Xiaojun Liang, Yongning Li, Zuyuan Wang
Based on spectral graph theory, spectral clustering methods have recently shown attractive advantages in segmentation problems (Luxburg 2007). Unlike the classic k-means clustering approach, spectral clustering methods place no restrictions on the data distribution (Fowlkes et al. 2004). Spectral clustering has a theoretical foundation similar to that of the iterative two-class cut method (Reitberger et al. 2009), but offers better effectiveness in multiclass problems (Luxburg 2007). Heinzel and Huber (2018) introduced spectral clustering for tree segmentation using dense terrestrial laser scanning data, with good performance. The biggest challenge of the spectral clustering method for LiDAR point cloud data is computational complexity. For a data set with n points, O(n²) memory is needed to construct the similarity matrix, and the complexity of computing its eigenvectors is O(n³) (Lin and Cohen 2010; Ye et al. 2016). Besides being time consuming, the procedure poses the risk of memory overflow when n is large, as it is for large-scale dense LiDAR point cloud data.
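The Nyström technique addresses both costs by sampling m ≪ n landmark points, eigendecomposing only the m × m landmark block of the similarity matrix, and extrapolating the eigenvectors to all n points. The following is a rough sketch of that idea, not the paper's exact procedure: the 2-D points stand in for LiDAR returns, and the `rbf` helper, the landmark count, and the kernel width are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D point cloud standing in for LiDAR returns:
# two well-separated blobs of 100 points each.
pts = np.vstack([rng.normal(0, 0.3, (100, 2)),
                 rng.normal(3, 0.3, (100, 2))])
n, m = len(pts), 20                     # m landmark points, m << n

idx = rng.choice(n, m, replace=False)   # random landmark sample

def rbf(X, Y, gamma=1.0):
    """Gaussian similarity between the rows of X and the rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

C = rbf(pts, pts[idx])                  # n x m slice of the similarity matrix
W = C[idx]                              # m x m landmark block

# Nystrom: eigendecompose only the small block, then extrapolate to all
# points -- the full n x n similarity matrix is never materialized.
vals, U = np.linalg.eigh(W)
keep = vals > 1e-10                     # drop numerically null directions
U_full = C @ U[:, keep] / vals[keep]    # approximate eigenvectors (up to scaling)

print(U_full.shape)                     # (n, rank) instead of n x n storage
```

Spectral clustering then runs k-means on a few columns of `U_full`, so memory stays O(nm) rather than O(n²).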
A novel data representation framework based on nonnegative manifold regularisation
Published in Connection Science, 2021
Yan Jiang, Wei Liang, Jintian Tang, Hongbo Zhou, Kuan-Ching Li, Jean-Luc Gaudiot
From spectral graph theory, spectral clustering can be solved by the Lagrange multiplier method and reformulated as a generalised eigendecomposition problem, the solution of which is a collection of eigenvectors. However, the obtained continuous solution can deviate far from the discrete solution. Thus, discretisation of the cluster membership indicator matrix is our main concern. Spectral rotation (Huang et al., 2013; Zelnik-Manor & Perona, 2005) relaxes the continuous eigenvectors into a cluster membership indicator matrix in a more effective way than k-means. Let F be the eigenvector matrix; the objective function of spectral rotation is then as in Equation (8), min_{Y,R} ||Y − FR||²_F, where Y denotes an indicator matrix with a unique 1 in each row vector and R is an arbitrary orthogonal matrix. The cluster indicator matrix can be learned by applying an alternating optimisation method.
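The alternating optimisation can be sketched as follows: with R fixed, the best indicator Y places a 1 at the largest entry of each row of FR; with Y fixed, the best orthogonal R is the orthogonal-Procrustes solution obtained from the SVD of FᵀY. A toy example on a synthetic embedding F (the data, rotation angle, and iteration count are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical continuous spectral embedding F (n x c): two clusters
# whose rows are rotated, slightly noisy versions of the axis vectors.
n, c = 10, 2
F = np.zeros((n, c))
F[:5, 0] = 1.0
F[5:, 1] = 1.0
theta = 0.7                                   # arbitrary rotation mixing the axes
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
F = F @ Q + rng.normal(0, 0.02, (n, c))

# Alternating optimisation of min_{Y,R} ||Y - F R||_F^2:
R = np.eye(c)                                 # start from the identity rotation
for _ in range(20):
    # Y-step: one-hot indicator from the largest entry of each row of F R.
    G = F @ R
    Y = np.zeros_like(G)
    Y[np.arange(n), G.argmax(axis=1)] = 1.0
    # R-step: orthogonal Procrustes, R = U V^T from the SVD of F^T Y.
    U, _, Vt = np.linalg.svd(F.T @ Y)
    R = U @ Vt

labels = Y.argmax(axis=1)                     # discrete cluster assignments
```

After a few iterations R undoes the mixing rotation, and the rows of FR are nearly one-hot, so the indicator Y recovers the two planted clusters directly rather than via a separate k-means run.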
Sequential Laplacian regularized V-optimal design of experiments for response surface modeling of expensive tests: An application in wind tunnel testing
Published in IISE Transactions, 2019
Adel Alaeddini, Edward Craft, Rajitha Meka, Stanford Martinez
where λ can be set using cross-validation. Let D be a diagonal matrix whose diagonal elements are calculated as d_ii = Σ_j s_ij, where s_ij is the ijth element of the similarity matrix S. Also, let L = D − S. The matrix L is called the graph Laplacian in spectral graph theory (Chung, 1996). Also, let X represent the matrix of all feasible design vectors and X_m represent the matrix of measured design vectors. The solution to the minimization problem (4) then follows in closed form, where I is an identity matrix and y_i is the measurement for design vector x_i. As in LS regression (Equations (1) to (3)), the Hessian of the LRLS is also part of the estimated-coefficients equation:
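The construction of L and a generic Laplacian-regularized least-squares estimator can be sketched as below. This is an illustrative formulation under conventional notation (Gaussian similarity, hypothetical regularization weights `lam` and `gamma`), not necessarily the exact estimator or tuning of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical design setup: X holds all feasible design vectors,
# X_m the subset that has actually been measured, y the measurements.
X = rng.uniform(-1, 1, (30, 3))
measured = np.arange(10)
X_m = X[measured]
y = X_m @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.1, 10)

# Similarity matrix S over all feasible designs (Gaussian kernel),
# degree matrix D with d_ii = sum_j s_ij, and Laplacian L = D - S.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
S = np.exp(-d2)
D = np.diag(S.sum(axis=1))
L = D - S

# Generic Laplacian-regularized least squares: the smoothness penalty
# beta^T X^T L X beta augments the usual ridge-type normal equations.
lam, gamma = 1e-3, 1e-3                  # hypothetical regularization weights
H = X_m.T @ X_m + lam * np.eye(3) + gamma * (X.T @ L @ X)   # Hessian-like matrix
beta = np.linalg.solve(H, X_m.T @ y)
```

The Hessian-like matrix H is exactly the quantity the excerpt refers to: it enters both the coefficient estimate and, in a V-optimal design context, the prediction-variance criterion.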