Big Data over Wireless Networks (WiBi)
Published in Yulei Wu, Fei Hu, Geyong Min, Albert Y. Zomaya, Big Data and Computational Intelligence in Networking, 2017
The use of graphs helps with finding correlations among seemingly disparate data. To represent structured and semistructured data as a graph, we can use nodes to represent the various entities in the data, such as Name, Id, etc., and use edges to connect one node to another, as well as to other subfields. An example of representing structured and semistructured data as a graph is shown in Figure 4.3. Much research is currently under way to represent various forms of unstructured data as graphs and use them for analytics. This field is called graph signal processing, wherein images, video, and other unstructured data are represented as graphs to which classical signal processing operations, such as the graph Fourier transform, can be applied to extract useful patterns. An example of expressing an image as a graph is shown in Figure 4.4. After segmenting an image into regions based on some preferred properties, a graph is constructed: each node corresponds to a segmented region of the image and carries the signal value associated with that region, and the edges connect different regions based on some distance measure between the segments. Similar techniques are used to represent three-dimensional point clouds, videos, etc.
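The segment-to-graph construction described above can be sketched in a few lines of NumPy. The segment centroids, signal values, distance threshold, and Gaussian edge weighting below are illustrative assumptions, not values from the chapter:

```python
import numpy as np

# Hypothetical segmented image: each region has a centroid (row, col)
# and a mean intensity (the "signal" value carried by that graph node).
centroids = np.array([[2.0, 3.0], [2.5, 8.0], [7.0, 4.0], [8.0, 9.0]])
signal = np.array([0.2, 0.9, 0.5, 0.7])  # one value per segmented region

# Connect two regions when their centroids are closer than a chosen
# threshold; the edge weight decays with distance (a Gaussian kernel is
# one common distance-measure choice, assumed here for illustration).
threshold, sigma = 6.0, 3.0
n = len(centroids)
W = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        d = np.linalg.norm(centroids[i] - centroids[j])
        if d < threshold:
            W[i, j] = W[j, i] = np.exp(-d**2 / (2 * sigma**2))

print(W.round(3))  # symmetric weighted adjacency matrix of the region graph
```

The resulting weighted adjacency matrix, together with the per-node signal values, is exactly the structure on which graph signal processing operations such as the graph Fourier transform operate.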
Image Denoising with Nonlocal Spectral Graph Wavelets
Published in Olivier Lézoray, Leo Grady, Image Processing and Analysis with Graphs, 2012
David K. Hammond, Laurent Jacques, Pierre Vandergheynst
The operator L should be viewed as the graph analogue of the standard Laplacian operator −Δ for flat Euclidean domains. In particular, the eigenvectors of L are analogous to the Fourier basis elements e^(ik·x) and may be used to define the graph Fourier transform. As L is a real symmetric matrix, it has a complete set of orthonormal eigenvectors χ_ℓ ∈ ℝ^N for ℓ = 0, …, N − 1 with associated real eigenvalues λ_ℓ. We order these in nondecreasing order, so that λ_0 ≤ λ_1 ≤ … ≤ λ_{N−1}. For the graph Laplacian it can be shown that the eigenvalues are nonnegative and that λ_0 = 0. Now for any function f ∈ ℝ^N, the graph Fourier transform is defined by f̂(ℓ) = ⟨χ_ℓ, f⟩ = Σ_n χ_ℓ*(n) f(n).
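The definition above translates directly into code: diagonalize L and project the signal onto the eigenvectors. A minimal sketch, assuming a small 4-node cycle graph and an arbitrary signal chosen for illustration:

```python
import numpy as np

# Illustrative undirected graph (a 4-node cycle) given by its adjacency
# matrix; L = D - A is the combinatorial graph Laplacian.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# eigh returns eigenvalues of a symmetric matrix in nondecreasing order,
# matching the ordering lambda_0 <= lambda_1 <= ... <= lambda_{N-1}.
lam, chi = np.linalg.eigh(L)

f = np.array([1.0, 2.0, 3.0, 4.0])   # a graph signal f in R^N
f_hat = chi.T @ f                    # f_hat(l) = <chi_l, f>
f_rec = chi @ f_hat                  # orthonormality gives back f exactly

print(lam.round(6))  # lambda_0 = 0 and all eigenvalues nonnegative
```

Because the eigenvectors are real and orthonormal here, the conjugate in χ_ℓ*(n) is a plain transpose, and applying the eigenvector matrix again inverts the transform.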
New Method of Transformer Differential Protection Based on Graph Fourier Transform
Published in Electric Power Components and Systems, 2023
Liao Xiaojun, XianZheng Feng, Xiaoru Wang, Zhang Li
The graph Fourier transform uses the eigenvectors of the graph Laplacian matrix as the transform basis; the graph Fourier transform of a graph signal f is defined as f̂ = Uᵀf, where the columns of U are the Laplacian eigenvectors.
Graph convolutional networks with learnable spatial weightings for traffic forecasting applications
Published in Transportmetrica A: Transport Science, 2023
Bi Yu Chen, Yaohong Ma, Jiale Wang, Tao Jia, Xianglong Liu, William H. K. Lam
This section briefly describes the classical ConvGNNs to provide the necessary research background. Most classical ConvGNNs define the graph convolutional operator in the spectral domain by using the graph Fourier transform (Shuman et al. 2013). Assuming that graphs are undirected, the normalised graph Laplacian matrix L = I − D^(−1/2)AD^(−1/2) is a real symmetric positive semi-definite matrix, where D is a diagonal degree matrix with D_ii = Σ_j A_ij. This matrix can be eigen-decomposed into L = UΛUᵀ, where U = [u_0, …, u_{N−1}] is a set of orthonormal eigenvectors ordered by eigenvalues λ_0 ≤ … ≤ λ_{N−1}, and Λ is the diagonal matrix of eigenvalues. Accordingly, given a graph signal vector x_t during a specific time interval t, the graph Fourier transform function is defined as x̂_t = Uᵀx_t and the inverse graph Fourier transform is defined as x_t = Ux̂_t, where x̂_t represents the resultant signal vector from the graph Fourier transform. Then, the graph convolutional operator, denoted by *_G, for an input graph signal vector x_t and a graph filter g is defined as x_t *_G g = U(Uᵀg ⊙ Uᵀx_t), where ⊙ denotes the element-wise product and y_t = x_t *_G g is the output vector of the graph convolutional operator. Using this formulation, Bruna et al. (2014) proposed a ConvGNN, called spectral CNN, by learning the filter Uᵀg when training the convolution operator. The spectral CNN is powerful. However, its convolutional kernel covers the whole graph and does not hold the local connectivity property. Moreover, the spectral CNN requires the eigen-decomposition of L, which can be computationally expensive for large graphs.
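The spectral convolution x *_G g = U((Uᵀg) ⊙ (Uᵀx)) can be sketched end to end with NumPy. The 5-node path graph, the input signal, and the exponential low-pass filter below are illustrative assumptions standing in for a learned filter:

```python
import numpy as np

# Illustrative undirected graph: a 5-node path.
N = 5
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

# Normalised graph Laplacian L = I - D^(-1/2) A D^(-1/2).
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L = np.eye(N) - D_inv_sqrt @ A @ D_inv_sqrt

# Eigen-decomposition L = U diag(lam) U^T (the expensive step for
# large graphs noted in the text).
lam, U = np.linalg.eigh(L)

x = np.array([0.0, 1.0, 0.0, -1.0, 0.0])  # graph signal x_t
g_hat = np.exp(-2.0 * lam)                # filter in the spectral domain,
                                          # playing the role of U^T g

# Graph convolution: transform, filter element-wise, transform back.
y = U @ (g_hat * (U.T @ x))
print(y.round(4))
```

A spectral CNN learns g_hat (one coefficient per eigenvalue) during training; since every coefficient touches every node through U, the kernel covers the whole graph, which is exactly the locality limitation the excerpt points out.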