Fundamental Pattern Recognition Concepts
Published in Manas Kamal Bhuyan, Computer Vision and Image Processing, 2019
The basic differences between PCA and ICA can be listed as follows:

- PCA employs an orthogonal transformation to convert a set of correlated variables into a set of linearly uncorrelated variables (an orthogonal basis set). ICA, on the other hand, decomposes a multivariate signal into non-Gaussian components and determines them by maximizing the statistical independence of the estimated components.
- PCA finds the directions in which the variance of the data is maximum, through successive orthogonal approximations; ICA finds additively separable components corresponding to axes of the data that are statistically independent.
- PCA removes correlation, but not higher-order dependencies in the data. ICA removes correlation as well as higher-order dependencies.
- In PCA, some components are more important than others (those corresponding to larger eigenvalues). In ICA, all components are equally important.
- In PCA, the basis vectors are orthogonal, since the eigenvectors of a covariance matrix are orthogonal. In ICA, the vectors are not orthogonal in general.
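The first and last points can be illustrated with a minimal numpy sketch (the data and seed are illustrative): PCA computed from the eigendecomposition of the covariance matrix yields an orthogonal basis, and projecting onto that basis decorrelates the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated 2-D data: the second variable is a noisy copy of the first.
x1 = rng.normal(size=1000)
x2 = 0.8 * x1 + 0.2 * rng.normal(size=1000)
X = np.column_stack([x1, x2])

# PCA via eigendecomposition of the covariance matrix.
C = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)   # columns of eigvecs are the principal directions

# The principal directions form an orthogonal basis ...
assert np.allclose(eigvecs.T @ eigvecs, np.eye(2), atol=1e-10)

# ... and projecting onto them removes the correlation (but not any
# higher-order dependence, which is what ICA additionally targets).
scores = X @ eigvecs
print(np.round(np.corrcoef(scores, rowvar=False), 6))
```

The off-diagonal entries of the printed correlation matrix are zero up to floating-point error, while the eigenvalues indicate that one component carries far more variance than the other.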
Principles of Neural Network Learning
Published in Paul R. Prucnal, Bhavin J. Shastri, Malvin Carl Teich, Neuromorphic Photonics, 2017
Paul R. Prucnal, Bhavin J. Shastri, Malvin Carl Teich
Linsker proposed that each processing stage (a neuron or a neural layer) follow a general principle called maximum information preservation, dubbed Infomax [3]. The principle states that the goal of a layered perceptron neural network's learning procedure is to maximize the mutual information between its outputs and inputs [4]. Linsker discovered interesting consequences of the Infomax principle; for example, it leads a population of neurons to select features with a high signal-to-noise ratio. Algorithms that follow this principle are of particular interest because they can be implemented using locally available information, allowing for distributed, unsupervised adaptation. This can essentially be done by two well-known techniques applied to the input signals: principal component analysis (PCA) and independent component analysis (ICA) [5]. PCA maximizes the variance (a second-order moment) of the output, whereas ICA maximizes its kurtosis (a fourth-order moment).
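The second-order/fourth-order contrast can be made concrete with a small numpy sketch (the rotation angle, sources, and seed are illustrative): for a whitened mixture of two independent uniform sources, the variance is the same in every projection direction, so a second-order criterion cannot find the sources, while the (excess) kurtosis varies with direction and is extremized along the source axes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent, non-Gaussian (uniform) sources with unit variance.
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 5000))

# Mix them with a rotation; the mixture then already has identity covariance.
theta = 0.6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = R @ S

def variance(v):
    return np.var(v @ X)

def excess_kurtosis(v):
    y = v @ X
    y = y - y.mean()
    return np.mean(y**4) - 3 * np.mean(y**2) ** 2

angles = np.linspace(0, np.pi, 180, endpoint=False)
dirs = np.column_stack([np.cos(angles), np.sin(angles)])

variances = np.array([variance(v) for v in dirs])
kurtoses = np.array([excess_kurtosis(v) for v in dirs])

# Second-order statistics are blind here: variance is ~1 in every direction.
print(variances.min(), variances.max())
# Fourth-order statistics are not: |kurtosis| peaks along the source axes
# (uniform sources have negative excess kurtosis, so we take the minimum).
best = angles[np.argmin(kurtoses)]
print(best, theta)
```

The direction of extremal kurtosis recovers the mixing angle (up to a 90-degree ambiguity, since either source axis is a valid answer), which is exactly the information the variance profile fails to provide.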
Sonar Performance Models
Published in Paul C. Etter, Underwater Acoustic Modeling and Simulation, 2017
Benchekroun and Mansour (2006) applied blind separation of sources (BSS) to the solution of MIMO channel problems in ocean-acoustic tomography (OAT), specifically a variant of OAT called passive acoustic tomography (PAT), in which a cooperative source is replaced by a noncooperative noise source such as a ship of opportunity. BSS refers to a method for retrieving unknown, mixed, independent sources from their observed mixture. Typically, researchers use independent component analysis (ICA) algorithms based on the assumption that the sources are independent. For example, Xu et al. (2010) used blind source separation to segregate sonar data into a reverberation component and a target component. Also see the work reported by Zhang et al. (2000).
Dynamic inner independent component analysis-based incipient fault detection for electric drive systems of high-speed trains
Published in Journal of Control and Decision, 2023
Hongmei Wang, Jingkun Wang, Shuiqing Xu, Chao Cheng, Qiang Liu, Hongtian Chen
Non-Gaussian signals are the main signals of electric drive systems, and ICA has advantages over traditional data-processing techniques, such as PCA, in dealing with non-Gaussian signals. The core of ICA is to estimate the original independent components S and the mixing matrix A from the observed data X. The main objective is to compute a separation matrix W that yields an estimate Y of the independent components, i.e. Y = WX. The key to ICA-based model estimation is the degree to which the independent components deviate from the normal distribution, i.e. the non-Gaussianity measure. Negentropy is chosen here as the evaluation index of non-Gaussianity, and the maximum-negentropy objective function of an independent component can be expressed as

J(Y) ≈ [E{G(Y)} − E{G(V)}]²,

where Y is the estimate of the independent component, J(Y) represents the negentropy of the independent-component estimate, G represents a nonlinear function, E{·} represents the mean operation, and V is a Gaussian variable with zero mean and unit variance. For the nonlinear function, a common choice is G(u) = (1/a) log cosh(au) with 1 ≤ a ≤ 2. The next step is to obtain detection thresholds by statistical methods to detect changes in the system.
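The negentropy approximation above can be sketched in a few lines of numpy (the log-cosh nonlinearity, sample sizes, and seed are illustrative): a Gaussian sample scores near zero, while a heavy-tailed (super-Gaussian) sample scores clearly above zero.

```python
import numpy as np

rng = np.random.default_rng(2)

def negentropy(y, a=1.0):
    """Approximate J(y) = (E[G(y)] - E[G(v)])^2 with G(u) = (1/a)*log(cosh(a*u))."""
    y = (y - y.mean()) / y.std()          # standardize to zero mean, unit variance
    v = rng.normal(size=y.size)           # Gaussian reference V with matching moments
    G = lambda u: np.log(np.cosh(a * u)) / a
    return (np.mean(G(y)) - np.mean(G(v))) ** 2

gaussian = rng.normal(size=20000)
laplacian = rng.laplace(size=20000)       # heavy-tailed, non-Gaussian

print(negentropy(gaussian))               # near zero: Gaussian has zero negentropy
print(negentropy(laplacian))              # positive: deviation from Gaussianity
```

Because the squared difference is zero exactly when Y has the same expected nonlinearity value as a Gaussian, maximizing J over the rows of W pushes each extracted component away from Gaussianity, which is the working principle of FastICA-style estimation.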
An Independent Component Analysis Approach for Wide-Area Monitoring of Power System Disturbances
Published in Electric Power Components and Systems, 2020
José de Jesús Nuño Ayón, Jorge Luis García Sánchez, Eduardo Salvador Bañuelos Cabral, Julián Sotelo Castañón, María José Rodríguez Roblero
ICA is a multivariate statistical method that allows extracting hidden components from a set of observed variables or measurements. The ICA method solves the blind source separation (BSS) problem, which consists of recovering source signals (independent components) from mixtures of these signals (observed variables) without additional information about the source properties or the mixing characteristics. The ICA technique is one of the most popular BSS methods, and it makes the following assumptions: (i) the mixing matrix has full column rank, but is otherwise unknown; (ii) the source signals are statistically independent and have unit variance; and (iii) the source signals are non-Gaussian, or at most one of them has a Gaussian distribution [16].
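A minimal numpy sketch of these assumptions in action (the sources, mixing matrix, nonlinearity, and seed are all illustrative, not from the article): two independent, unit-variance, non-Gaussian sources are mixed by an unknown full-rank matrix, and a deflation-based one-unit FastICA iteration on the whitened mixtures recovers them up to sign and ordering.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two independent, non-Gaussian (uniform) sources -- assumptions (ii) and (iii).
n = 5000
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, n))   # unit variance

# Unknown full-column-rank mixing matrix -- assumption (i).
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S                                               # observed mixtures

# Whiten the observations (zero mean, identity covariance).
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# One-unit FastICA with the tanh nonlinearity, deflation over components.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        g = np.tanh(Z.T @ w)
        w_new = Z @ g / n - (1 - g ** 2).mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)   # decorrelate from earlier components
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-9
        w = w_new
        if converged:
            break
    W[i] = w

Y = W @ Z   # recovered sources, up to sign and permutation
```

The inherent sign and permutation ambiguity is why assumption (ii) fixes the source variances: the scale of each source cannot be recovered from the mixtures alone, so it is conventionally absorbed into the sources.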
Identifying nonlinear variation patterns with deep autoencoders
Published in IISE Transactions, 2018
Phillip Howard, Daniel W. Apley, George Runger
Blind Source Separation (BSS) methods such as Independent Component Analysis (ICA) are closely related to our task of identifying unique variation patterns. Whereas PCA seeks orthogonal directions that are efficient for representing the original data, ICA searches for directions that minimize the statistical dependence between them (Comon, 1994). However, ICA discovers only linear patterns, making it unsuitable for our task of blindly identifying nonlinear variation patterns. BSS methods for the case of linear patterns have been proposed with applications to manufacturing variability analysis in Apley and Lee (2003) and Shan and Apley (2008).