Regularization and Kernel Methods
Published in Data Science and Machine Learning, 2019
Dirk P. Kroese, Zdravko I. Botev, Thomas Taimre, Radislav Vaisman
where the coefficients $\widehat{\beta}_0$ and $\widehat{\alpha}_i$ depend only on the inner products $\{\langle \mathbf{x}_i, \mathbf{x}_j\rangle\}$. We will see shortly that the representer theorem (Theorem 6.6) generalizes this result to a broad class of regularized optimization problems.
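To make this concrete, here is a minimal sketch of ridge regression solved in its dual form, assuming the scaled objective $\frac{1}{n}\|\mathbf{y}-\mathbf{X}\boldsymbol{\beta}\|^2 + \gamma\|\boldsymbol{\beta}\|^2$ (function and parameter names are illustrative): the fitted coefficients depend on the training inputs only through the Gram matrix of inner products.

```python
import numpy as np

def dual_ridge(X, y, gamma=0.1):
    """Ridge regression in dual form (illustrative sketch).

    The solution beta_hat = X.T @ alpha_hat, with
    alpha_hat = (G + n*gamma*I)^{-1} y, depends on the inputs
    only through the Gram matrix G = {<x_i, x_j>}.
    """
    n = len(y)
    G = X @ X.T                                   # inner products <x_i, x_j>
    alpha = np.linalg.solve(G + n * gamma * np.eye(n), y)
    return alpha

# Prediction at a new point x also needs only inner products:
#   f(x) = sum_i alpha_i <x, x_i>
```

Replacing each inner product with a kernel evaluation $k(\mathbf{x}_i, \mathbf{x}_j)$ is exactly the step that the representer theorem justifies in general.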
Controlling Sources of Inaccuracy in Stochastic Kriging
Published in Technometrics, 2019
At first glance, the stochastic kriging model, which assumes a Gaussian process mean with Gaussian noise, appears quite narrow and restrictive. In fact, the model is not as restrictive as it appears. In particular, if one believes that the target function f lies in a reproducing kernel Hilbert space (say, because f has a fixed number of continuous partial derivatives), then a representer theorem (Schölkopf, Herbrich, and Smola 2001) ensures that the solution to a very broad range of loss- or likelihood-based penalized regression problems has the form given in (2), although the coefficients would be estimated differently and the regularizing matrix constructed differently, depending on the loss or likelihood. In practice, the stochastic kriging predictor is typically a high-accuracy nonparametric estimate of the underlying function f, and would represent a high-quality starting approximation for each of the three examples mentioned in the first paragraph of this article (turbulent flows, sexual transmissibility, and cardiovascular policy).
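As one concrete instance of that general statement, the sketch below uses squared-error loss with a Gaussian RBF kernel (i.e., kernel ridge regression, not the stochastic kriging likelihood itself; names and parameters are illustrative). The representer theorem guarantees the minimizer is a kernel expansion of the form in (2); under a different loss or likelihood, only the coefficient estimates and the regularizing matrix would change.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian RBF kernel: k(x, x') = exp(-||x - x'||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * sigma**2))

def kernel_ridge_fit(X, y, lam=1e-2, sigma=1.0):
    # Squared-error loss with an RKHS penalty; by the representer theorem
    # the minimizer is f(x) = sum_i alpha_i k(x, x_i), where alpha solves
    # a regularized n x n linear system.
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def kernel_ridge_predict(X_train, alpha, X_new, sigma=1.0):
    # Evaluate the kernel expansion f(x) = sum_i alpha_i k(x, x_i).
    return rbf_kernel(X_new, X_train, sigma) @ alpha
```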
A Tweedie Compound Poisson Model in Reproducing Kernel Hilbert Space
Published in Technometrics, 2023
Yi Lian, Archer Yi Yang, Boxiang Wang, Peng Shi, Robert William Platt
We propose a nonparametric Tweedie model in which the function f is chosen from a reproducing kernel Hilbert space $\mathcal{H}$. To learn the function f from the data, we minimize a penalized negative log-likelihood of the form $\min_{f\in\mathcal{H}}\; -\ell(f) + \lambda\|f\|_{\mathcal{H}}^2$, where $\lambda\|f\|_{\mathcal{H}}^2$ is a generalized Tikhonov regularization defined in the Hilbert space. The optimization problem for f is infinite-dimensional, and f does not belong to a specific parametric family. The representer theorem (Wahba 1990) shows that f can be parameterized by a combination of kernel functions, so that $f(\mathbf{x}_i) = \alpha_0 + \mathbf{K}_i^{\top}\boldsymbol{\alpha}$, where $\mathbf{K}_i$ is the ith row of the n × n kernel matrix $\mathbf{K}$, generated by a positive definite kernel function $K(\cdot,\cdot)$, and $\alpha_0$ and $\boldsymbol{\alpha} = (\alpha_1,\ldots,\alpha_n)^{\top}$ are the coefficients. This result gives f a "parametric form" with a finite-dimensional representation whose dimension depends on the sample size n. We consider commonly used kernel functions, including the Gaussian radial basis function (RBF) kernel $K(\mathbf{x},\mathbf{x}') = \exp(-\|\mathbf{x}-\mathbf{x}'\|^2/(2\sigma^2))$ and the Laplace kernel $K(\mathbf{x},\mathbf{x}') = \exp(-\|\mathbf{x}-\mathbf{x}'\|/\sigma)$, where σ is a kernel parameter. Consequently, (8) is equivalent to a finite-dimensional problem, (10), that minimizes a smooth convex function of $(\alpha_0, \boldsymbol{\alpha})$. We refer to this model as the Ktweedie model. The algorithms for optimizing (10) will be discussed in Section 3.1.
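A minimal sketch of the resulting finite-dimensional problem, assuming a log link $\mu_i = \exp(\alpha_0 + \mathbf{K}_i^{\top}\boldsymbol{\alpha})$, an RKHS penalty $\lambda\,\boldsymbol{\alpha}^{\top}\mathbf{K}\boldsymbol{\alpha}$ (intercept unpenalized), and a generic quasi-Newton optimizer rather than the specialized algorithms of Section 3.1. The Tweedie negative log-likelihood below drops terms that do not involve μ and applies for power parameter 1 < ρ < 2; function and parameter names are illustrative, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian RBF kernel: K(x, x') = exp(-||x - x'||^2 / (2 sigma^2))
    return np.exp(-cdist(X, Z, "sqeuclidean") / (2 * sigma**2))

def fit_kernel_tweedie(X, y, rho=1.5, lam=1.0, sigma=1.0):
    # Minimize the penalized Tweedie negative log-likelihood over
    # (alpha0, alpha), with f(x_i) = alpha0 + K_i^T alpha and mu = exp(f).
    n = len(y)
    K = rbf_kernel(X, X, sigma)

    def objective(theta):
        alpha0, alpha = theta[0], theta[1:]
        mu = np.exp(alpha0 + K @ alpha)               # log link
        # Tweedie NLL up to a constant in mu, valid for 1 < rho < 2
        nll = np.sum(-y * mu ** (1 - rho) / (1 - rho)
                     + mu ** (2 - rho) / (2 - rho))
        return nll / n + lam * alpha @ K @ alpha      # RKHS penalty on f

    res = minimize(objective, np.zeros(n + 1), method="L-BFGS-B")
    return res.x[0], res.x[1:], K
```

For small n this runs as-is (L-BFGS-B falls back to finite-difference gradients); the paper's Section 3.1 algorithms exploit the smoothness and convexity of (10) more efficiently.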