The Geometric Theory of Holomorphic Functions
Published in Steven G. Krantz, Complex Variables, 2019
Under this mapping the point ∞ is sent to −1, the point 1 is sent to (i − 1)/(i + 1) = i, and the point −1 is sent to (i − (−1))/(i + (−1)) = −i. Thus the image under the Cayley transform (a linear fractional transformation) of three points of ℝ ∪ {∞} consists of three points on the unit circle. Since three points determine a (generalized) circle, and since linear fractional transformations send generalized circles to generalized circles, we may conclude that the Cayley transform sends the real line to the unit circle. Now the Cayley transform is one-to-one and onto from ℂ ∪ {∞} to ℂ ∪ {∞}. By continuity, it sends the upper half-plane either to the (open) unit disc or to the complement of the closed unit disc. The image of i is 0, so in fact the Cayley transform sends the upper half-plane to the unit disc.
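These image points can be checked numerically. The short Python sketch below is an illustration rather than part of the text (the helper name cayley is assumed): it evaluates z ↦ (i − z)/(i + z) at 1, −1, a large real number standing in for ∞, and at i, confirming that the boundary points land on the unit circle and that i goes to the center of the disc.

```python
# Numerical check of the Cayley transform z -> (i - z)/(i + z).
# Illustrative sketch: the function name "cayley" is an assumption, not from the text.

def cayley(z: complex) -> complex:
    """Cayley transform sending the upper half-plane to the unit disc."""
    return (1j - z) / (1j + z)

# Images of 1 and -1 lie on the unit circle (modulus 1).
for z in (1, -1):
    w = cayley(z)
    print(z, "->", w, "|w| =", abs(w))

# The point at infinity maps to -1 (approximate by a large real number).
print("large real z ->", cayley(1e12))

# The interior point i maps to 0, the center of the disc.
print("i ->", cayley(1j))
```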
Variable selection for Gaussian process regression through a sparse projection
Published in IISE Transactions, 2022
Chiwoo Park, David J. Borth, Nicholas S. Wilson, Chad N. Hunter
The authors stated that the q columns of the estimated projection matrix could identify a few projection directions of the original inputs that are highly relevant to the response variable. However, there is an identifiability issue with this matrix: right-multiplying it by an arbitrary orthonormal matrix O (including any rotation matrix) yields the same distance, so there are infinitely many versions of the matrix with different column directions that achieve the same factor distance. Tripathy et al. (2016) proposed the active subspace distance, in which V is a q × p projection matrix with orthonormal rows and D is a diagonal matrix with positive entries. In this parameterization, the projection matrix V defines a low-dimensional projection of the input features, and the diagonal matrix D defines the weights on the input features. When the diagonal elements of D are all distinct, the columns of the matrix V are uniquely identified. The authors combined the Matérn 3/2 covariance with the active subspace distance. An iterative optimization of V and D was proposed based on the marginal likelihood maximization criterion. Since V is an orthogonal matrix, optimizing for V involves a complex orthogonality-preserving iteration based on the Cayley transform (Wen and Yin, 2013). This approach is useful for the DR, but sparsifying V for the VS while preserving the orthogonality is not straightforward.
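To make the orthogonality-preserving iteration concrete, the following Python sketch performs one Cayley-transform update step on a matrix with orthonormal columns, in the style of Wen and Yin (2013). The names (X, G, tau), the column-orthonormal convention, and the random test data are illustrative assumptions, not the exact routine used by Tripathy et al. (2016) or by the present paper.

```python
# A minimal sketch of one Cayley-transform update on the Stiefel manifold
# (Wen and Yin, 2013): a feasible step that preserves orthonormality of the
# columns of X. Names and test data below are illustrative assumptions.
import numpy as np

def cayley_step(X: np.ndarray, G: np.ndarray, tau: float) -> np.ndarray:
    """One curvilinear step: X has orthonormal columns, G is the objective gradient dF/dX, tau is the step size."""
    p, _ = X.shape
    A = G @ X.T - X @ G.T                       # skew-symmetric p x p matrix
    I = np.eye(p)
    # Crank-Nicolson-like update: Y = (I + tau/2 A)^(-1) (I - tau/2 A) X
    # The factor applied to X is a Cayley transform of a skew-symmetric matrix,
    # hence orthogonal, so Y keeps orthonormal columns.
    return np.linalg.solve(I + 0.5 * tau * A, (I - 0.5 * tau * A) @ X)

# Usage: start from a random orthonormal X and check feasibility is preserved.
rng = np.random.default_rng(0)
X0, _ = np.linalg.qr(rng.standard_normal((8, 3)))   # p = 8 inputs, q = 3 directions
G = rng.standard_normal((8, 3))                      # stand-in gradient
X1 = cayley_step(X0, G, tau=0.1)
print(np.allclose(X1.T @ X1, np.eye(3)))             # True: columns stay orthonormal
```

The update stays exactly on the constraint set at every iteration, which is why sparsifying the entries of the projection matrix, as needed for variable selection, does not fit naturally into this scheme.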