Introduction
Published in Constrained Clustering, 2008
Sugato Basu, Ian Davidson, Kiri L. Wagstaff
Along another research direction, efforts have been made to support both supervised and unsupervised learning with pairwise constraints [35,30,24,23,3,14,15,32,34]. In the context of graph partitioning, Yu and Shi [35] successfully integrated pairwise constraints into a constrained grouping framework, leading to improved segmentation results. Wagstaff et al. [30] introduced pairwise constraints into the k-means algorithm for unsupervised clustering problems. In more closely related work, Xing et al. [32] proposed a distance metric learning method that incorporates pairwise information and is solved by convex optimization. However, the method involves an iterative procedure of projection and eigenvalue decomposition, which is computationally expensive and sensitive to parameter tuning. By comparison, relevant component analysis (RCA) [24] is a simpler and more efficient approach for learning a full Mahalanobis metric: the points in each chunklet are centered on the chunklet mean, the covariance matrix of all the centered points is computed, and its inverse (a whitening transformation) serves as the Mahalanobis metric. However, only positive constraints can be utilized in this algorithm. In [23], Shental et al. propose a constrained Gaussian mixture model that incorporates both positive and negative pairwise constraints into a GMM using the expectation-maximization (EM) algorithm. Basu et al. [3] studied a new approach to semi-supervised clustering that adds penalty terms to the objective function. They also proposed an
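The RCA step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function name `rca_metric` and the toy chunklets are assumptions. It centers each chunklet on its own mean, pools the centered points to estimate the within-chunklet covariance, and returns its inverse as the learned Mahalanobis metric.

```python
import numpy as np

def rca_metric(chunklets):
    """Sketch of RCA: within-chunklet covariance -> inverse = Mahalanobis metric."""
    centered = []
    for c in chunklets:
        c = np.asarray(c, dtype=float)
        centered.append(c - c.mean(axis=0))   # center each chunklet on its mean
    Z = np.vstack(centered)                   # pool all centered points
    C = Z.T @ Z / len(Z)                      # within-chunklet covariance
    M = np.linalg.inv(C)                      # whitening / Mahalanobis metric
    return C, M

def mahalanobis(x, y, M):
    """Distance under the learned metric M."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(d @ M @ d))
```

Because the covariance is estimated only from positive (must-link) chunklets, the sketch also makes concrete why negative constraints cannot enter this formulation.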
Non-dyadic wavelet decomposition for sensory-motor imagery EEG classification
Published in Brain-Computer Interfaces, 2020
Poonam Chaudhary, Rashmi Agrawal
The whitening transformation is applied so that the variables of the transformed matrix are decorrelated and have unit variance, which helps to maximize the difference between the two classes. The CSP projection matrix is therefore built from the eigenvectors of the whitened covariance matrix of each class. The whitened covariance matrix for each class is obtained by multiplying the whitening matrix by that class's covariance matrix, as shown in Equation 13.
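A compact sketch of this whitening step, under the usual CSP construction (an assumption here, since Equation 13 is not reproduced in this excerpt): the composite covariance C1 + C2 is eigendecomposed, the whitening matrix P scales by the inverse square roots of its eigenvalues, and each class covariance is then transformed by P. The whitened class covariances share eigenvectors and their eigenvalues sum to one, which is what makes the shared eigenvectors discriminative.

```python
import numpy as np

def csp_whitening(C1, C2):
    """Whiten the composite covariance, then whiten each class covariance."""
    lam, U = np.linalg.eigh(C1 + C2)      # eigendecompose composite covariance
    P = np.diag(lam ** -0.5) @ U.T        # whitening matrix
    S1 = P @ C1 @ P.T                     # whitened covariance, class 1
    S2 = P @ C2 @ P.T                     # whitened covariance, class 2
    return P, S1, S2
```

Since S1 + S2 equals the identity, an eigenvector of S1 with a large eigenvalue is an eigenvector of S2 with a small one, so the same directions separate the two classes.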