The Interrelationship Between Flotation Variables and Froth Appearance
Published in Frothing in Flotation II (J.S. Laskowski, E.T. Woodburn, Eds.), 2018
D.W. Moolman, C. Aldrich, J.S.J. Van Deventer
Apart from its computational complexity on large data sets, the Sammon mapping algorithm as such cannot project data that are not contained in its training database. When new data have to be evaluated, the entire map has to be regenerated ab initio. Moreover, storing the map requires every single projected point to be retained, so the map cannot be represented in a compact form.
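To make these limitations concrete, the following is a minimal Sammon mapping sketch (assuming NumPy and SciPy; the `sammon` helper is illustrative, not code from the chapter). The fitted "map" is nothing but the full matrix of projected coordinates, so every point must be stored, and embedding new observations means re-running the whole fit on the augmented data set:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist

def sammon(X, n_components=2, seed=0):
    """Fit a Sammon map by directly minimizing the Sammon stress.

    Illustrative sketch: the 'map' is just the n x n_components matrix Y,
    so all projected points must be stored and there is no parametric
    model for projecting unseen data.
    """
    n = X.shape[0]
    d_high = pdist(X)                          # condensed pairwise distances
    d_high = np.where(d_high == 0, 1e-12, d_high)
    scale = d_high.sum()

    def stress(y_flat):
        Y = y_flat.reshape(n, n_components)
        d_low = pdist(Y)
        return np.sum((d_high - d_low) ** 2 / d_high) / scale

    rng = np.random.default_rng(seed)
    y0 = rng.normal(size=n * n_components)
    res = minimize(stress, y0, method="L-BFGS-B")
    return res.x.reshape(n, n_components)

# Evaluating new data means regenerating the entire map ab initio:
X_old = np.random.default_rng(1).normal(size=(30, 10))
Y_old = sammon(X_old)
X_new = np.random.default_rng(2).normal(size=(5, 10))
Y_all = sammon(np.vstack([X_old, X_new]))  # full re-fit; old coordinates change
```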
Robust monitoring of stochastic textured surfaces
Published in International Journal of Production Research, 2022
Many dissimilarity-based manifold learning algorithms (and their variants) solve for the low-dimensional manifold representation $\{y_i\}_{i=1}^{n}$ of a set of high-dimensional observations via $\min_{y_1,\dots,y_n} \sum_{i<j} w_{ij}\,\big(m(\delta_{ij}) - \|y_i - y_j\|\big)^2$, where $m(\cdot)$ is some monotonic function, $\delta_{ij}$ is some distance measure between the $i$th and $j$th observations, and $w_{ij}$ are weights of the observation pairs. For instance, in classical multidimensional scaling (cMDS, Torgerson 1952), Sammon mapping (SM, Sammon 1969), and curvilinear component analysis (CCA, Demartines and Hérault 1997), $m(\cdot)$ is the identity function. However, these methods differ in the weights $w_{ij}$. Specifically, cMDS sets $w_{ij} = 1$ and solves the optimisation problem via an eigendecomposition of the 'doubly centered' dissimilarity matrix (e.g. see Izenman 2013). Note that cMDS can only handle close-to-linear manifolds. On the other hand, SM sets $w_{ij} = 1/\delta_{ij}$, which focuses on the local behaviour, so that it can learn nonlinear manifolds. Similarly, CCA can learn nonlinear manifolds via local behaviour by setting $w_{ij}$ as some decreasing function of $\|y_i - y_j\|$.
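Since the three methods share the same stress and differ only in the weights, the contrast fits in a few lines. Below is a minimal sketch (assuming NumPy and SciPy; `stress` is a hypothetical helper, not code from the paper) that evaluates the common objective under each weighting scheme:

```python
import numpy as np
from scipy.spatial.distance import pdist

def stress(Y, delta, scheme="cmds", lam=1.0):
    """Generalised MDS stress sum_{i<j} w_ij * (delta_ij - d_ij)^2,
    where d_ij are the low-dimensional distances and delta is the
    condensed vector of high-dimensional dissimilarities."""
    d = pdist(Y)
    if scheme == "cmds":      # w_ij = 1: all pairs weighted equally
        w = np.ones_like(delta)
    elif scheme == "sammon":  # w_ij = 1/delta_ij: emphasises local pairs
        w = 1.0 / np.maximum(delta, 1e-12)
    elif scheme == "cca":     # w_ij decreasing in the *output* distance d_ij
        w = np.exp(-d / lam)
    else:
        raise ValueError(scheme)
    return np.sum(w * (delta - d) ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))      # high-dimensional observations
delta = pdist(X)                  # their pairwise dissimilarities
Y = rng.normal(size=(30, 2))      # candidate low-dimensional coordinates
for s in ("cmds", "sammon", "cca"):
    print(s, stress(Y, delta, scheme=s))
```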
Analyzing Nonparametric Part-to-Part Variation in Surface Point Cloud Data
Published in Technometrics, 2022
In Step 2 of the proposed framework, we apply a class of manifold learning algorithms that takes pairwise dissimilarities, obtained in Step 1, as input (as opposed to the observational vectors themselves) to obtain the low-dimensional manifold coordinates of the N given point clouds. Some notable examples of dissimilarity-based manifold learning algorithms include classical multidimensional scaling (cMDS, Torgerson 1952), Sammon mapping (SM, Sammon 1969), curvilinear component analysis (CCA, Demartines and Hérault 1997), isometric mapping (ISOMAP, Tenenbaum, de Silva, and Langford 2000), t-distributed stochastic neighbor embedding (t-SNE, van der Maaten and Hinton 2008), and uniform manifold approximation and projection (UMAP, McInnes et al. 2018).

In particular, cMDS, SM, and CCA can be considered as instances of a broad class of multidimensional scaling techniques, which solve for the coordinates $\{y_i\}_{i=1}^{N}$ from $\min_{y_1,\dots,y_N} \sum_{i<j} w_{ij}\,\big(m(\delta_{ij}) - \|y_i - y_j\|\big)^2$, where $\delta_{ij}$ are the pairwise dissimilarities, $w_{ij}$ are weights of the observation pairs, and m(⋅) is some monotonic function. For cMDS, SM, and CCA, m(⋅) is the identity function. Setting $w_{ij} = 1$ gives rise to cMDS, the solutions of which can be obtained via an eigendecomposition of the “doubly centered” dissimilarity matrix (see, e.g., Izenman 2008, chap. 13 for details). Setting $w_{ij} = 1/\delta_{ij}$ gives rise to SM, which can learn nonlinear manifolds because it emphasizes the local behavior through the weights $w_{ij}$. Setting $w_{ij}$ as some decreasing function of $\|y_i - y_j\|$ gives rise to CCA, which can also learn nonlinear manifolds.

ISOMAP extends cMDS to handle nonlinear manifolds by applying cMDS to a “geodesic” distance matrix instead of the original distance/dissimilarity matrix. The geodesic distance between two points on a manifold is defined as the shortest distance between these two points along the manifold; it can be found by minimizing the sum of distances along a path on the manifold that connects the two points through adjacent points.

Instead of preserving distances, t-SNE aims to preserve the conditional probabilities of picking a point as a neighbor of a given point; its solutions are obtained by minimizing the Kullback–Leibler divergence of the probability distribution in the manifold space from that in the data space. UMAP, on the other hand, tries to preserve topological representations of the $y_i$’s and the data, which are constructed from fuzzy simplicial set representations of local manifold approximations; its solutions are obtained by minimizing the cross-entropy of the two fuzzy sets.
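To make Step 2 concrete, here is a minimal sketch (assuming NumPy, SciPy, and scikit-learn; the random matrix merely stands in for the Step-1 dissimilarities, and `classical_mds` is an illustrative helper, not the paper's code). It implements cMDS directly via the doubly centered eigendecomposition described above, and feeds the same precomputed dissimilarity matrix to scikit-learn's ISOMAP and t-SNE:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import Isomap, TSNE

def classical_mds(D, k=2):
    """cMDS: eigendecomposition of the doubly centered dissimilarity matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # doubly centered matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:k]      # keep the k largest eigenvalues
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Toy stand-in for the Step-1 dissimilarities between N point clouds.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))
D = squareform(pdist(X))                  # symmetric N x N dissimilarity matrix

Y_cmds = classical_mds(D)
Y_iso = Isomap(n_components=2, metric="precomputed").fit_transform(D)
Y_tsne = TSNE(n_components=2, metric="precomputed", init="random",
              perplexity=15, random_state=0).fit_transform(D)
# UMAP (from the separate umap-learn package) similarly accepts a
# precomputed matrix: umap.UMAP(metric="precomputed").fit_transform(D)
```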