Shrinkage Methods
Published in Julian J. Faraway, Linear Models with Python, 2021
There are other choices within the scikit-learn package that can be called shrinkage methods. The Elastic-Net method combines the ridge and lasso ideas by having both an L1 and an L2 penalty. This allows some predictors to be dropped, as in lasso, while still retaining the regularization advantages of ridge. The Least Angle Regression (LARS) method is related to lasso in its preference for models with a reduced number of predictors. The Orthogonal Matching Pursuit method goes further than lasso, which only encourages the elimination of predictors: it specifies a maximum number of nonzero coefficients. The scikit-learn package also contains a Bayesian regression implementation. By imposing weakly informative priors on the parameters, we achieve a similar effect to ridge regression. It is possible to make an exact identification between the two methods.
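The sketch below (not from the book) shows how these four scikit-learn estimators might be fitted and compared; the synthetic data, hyperparameter values, and the nonzero-coefficient count at the end are illustrative choices, not recommendations.

```python
# A minimal sketch of the shrinkage estimators mentioned above,
# fitted to synthetic data for illustration.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import (ElasticNet, Lars,
                                  OrthogonalMatchingPursuit, BayesianRidge)

X, y = make_regression(n_samples=100, n_features=10, n_informative=4,
                       noise=5.0, random_state=0)

# Elastic-Net: l1_ratio mixes the L1 (lasso) and L2 (ridge) penalties.
enet = ElasticNet(alpha=1.0, l1_ratio=0.5).fit(X, y)

# Least Angle Regression: favors models with few active predictors.
lars = Lars(n_nonzero_coefs=4).fit(X, y)

# Orthogonal Matching Pursuit: a hard cap on the nonzero coefficients.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=4).fit(X, y)

# Bayesian ridge: weakly informative priors shrink the coefficients,
# giving an effect similar to ridge regression.
bayes = BayesianRidge().fit(X, y)

for name, model in [("Elastic-Net", enet), ("LARS", lars),
                    ("OMP", omp), ("Bayesian ridge", bayes)]:
    nonzero = np.sum(np.abs(model.coef_) > 1e-8)
    print(f"{name}: {nonzero} nonzero coefficients")
```

Note how the sparsity is controlled differently in each case: Elastic-Net trades it off against the penalty weights, while LARS and OMP accept an explicit limit on the number of active predictors.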
Normal mode analysis of a relaxation process with Bayesian inference
Published in Science and Technology of Advanced Materials, 2020
Itsushi Sakata, Yoshihiro Nagano, Yasuhiko Igarashi, Shin Murata, Kohji Mizoguchi, Ichiro Akai, Masato Okada
In this section, we treat the aforementioned ‘polishing’ method within a Bayesian inference framework. The ‘polishing’ serves two purposes: mode search by regularization, and regression. Least angle regression and ordinary least squares (LARS-OLS) [37] is a regression-analysis framework that performs basis search and regression in a stepwise manner. Igarashi et al. recast LARS-OLS as Bayesian inference (called Bayesian LARS-OLS) and proposed a method that carries out the basis search under the Bayesian free energy criterion [21,22]. We extend their method so that it can be applied to material dynamics such as relaxation processes. In the Bayesian LARS-OLS framework, we treat DMD as a linear regression problem and express it with a probability model.
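As a rough illustration of the last point (not the paper's implementation), the sketch below expresses a linear regression as the standard conjugate Gaussian probability model and scores candidate bases by the Bayesian free energy, i.e. the negative log marginal likelihood, which is available in closed form for this model. The function name free_energy and the hyperparameters alpha and sigma2 are illustrative assumptions.

```python
# A minimal sketch: linear regression y = X w + noise as a probability model,
# with w ~ N(0, alpha^{-1} I) and y | w ~ N(X w, sigma2 I).
# The Bayesian free energy F = -log p(y | X) then has a closed form.
import numpy as np

def free_energy(X, y, alpha=1.0, sigma2=1.0):
    """Negative log evidence of the Gaussian linear model above.
    alpha and sigma2 are illustrative hyperparameters."""
    n = len(y)
    # Marginal covariance of y after integrating out w.
    C = sigma2 * np.eye(n) + (1.0 / alpha) * X @ X.T
    sign, logdet = np.linalg.slogdet(C)
    return 0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

# Basis search in this spirit: compare F across candidate subsets of
# columns (modes) and keep the subset with the smallest free energy.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X[:, :2] @ np.array([1.5, -2.0]) + 0.1 * rng.normal(size=50)
for cols in [[0], [0, 1], [0, 1, 2]]:
    print(cols, free_energy(X[:, cols], y))
```

In this toy run the two-column basis, which matches the generating model, attains the lowest free energy; the overfitted three-column basis is penalized by the larger marginal covariance.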