Basic Computations
Published in Jhareswar Maiti, Multivariate Statistical Modeling in Engineering and Management, 2023
OLS, GLS, and IRLS can all be viewed as variants of WLS. Recall Equation 3.38:

$\hat{\beta} = (X^T W X)^{-1} X^T W y$

If $W = I$, then $\hat{\beta} = (X^T X)^{-1} X^T y$, which is the traditional OLS estimate. If $W = E^{-1}$, then $\hat{\beta} = (X^T E^{-1} X)^{-1} X^T E^{-1} y$, which is the GLS estimate. IRLS differs from WLS only in its computational aspect: the weight matrix $W$ is updated iteratively.
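The closed-form estimate above can be sketched in a few lines of numpy; the data and the diagonal weight matrix below are illustrative, not from the chapter:

```python
import numpy as np

# Simulated regression data (illustrative only).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
beta_true = np.array([2.0, -1.5])
y = X @ beta_true + rng.normal(scale=0.1, size=50)

def wls(X, y, W):
    """Weighted least squares: solve (X^T W X) beta = X^T W y
    directly, avoiding an explicit matrix inverse."""
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

beta_ols = wls(X, y, np.eye(len(y)))            # W = I reduces to OLS
W = np.diag(rng.uniform(0.5, 2.0, size=len(y))) # arbitrary positive weights
beta_wls = wls(X, y, W)                         # general W: WLS (or GLS if W = E^-1)
```

With `W = I` the result matches an ordinary least-squares fit exactly; supplying the inverse error-covariance matrix as `W` gives the GLS estimate.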
Dynamic parameter identification and nonlinear friction compensation method for safety perception of heavy explosion-proof robots
Published in Journal of Control and Decision, 2022
The most common approaches to parameter identification for manipulators rely directly on Ordinary Least-Squares (OLS) estimation (Afrough & Hanieh, 2019) or Maximum Likelihood (ML) estimation (Young, 2015). Whether due to measurement noise, incorrect data filtering, or insufficient excitation, OLS parameter estimates may prove highly biased and can even lead to physically infeasible solutions. ML algorithms can cope with noisy torque and joint-angle measurements, but at a price: they require much greater computational effort and may fail to provide consistent estimates. Weighted Least-Squares (WLS) (Liu et al., 2020), a variant of least-squares estimation, is more efficient and easier to implement; however, it lacks noise immunity and depends strongly on the conditioning of the robot’s trajectory, as Leboutet et al. (2021) showed. The vulnerability of WLS to outliers can be addressed using Iteratively Reweighted Least-Squares (IRLS) (Han et al., 2020). IRLS is an iterative process that applies an additional penalty to outliers, through a dedicated weight vector recomputed at each iteration, to eventually mitigate their contribution to the final result.
Application of robust estimation in geodesy using the harmony search algorithm
Published in Journal of Spatial Science, 2018
An important issue that must be tackled in RE is the realization of the methods. Traditionally, RE is performed using the iteratively reweighted least squares (IRLS) technique, which is efficient and easy to implement. Nonetheless, because IRLS is inconvenient for many complex robust estimation methods, some authors have recently applied metaheuristic algorithms to compute the parameters (Baselga 2007, 2010, Baselga and Garcia-Asenjo 2008a, 2008b, 2008c, Yetkin and Berber 2013, 2014). Since RE amounts to minimizing an objective function, it is also an optimization problem, and metaheuristic algorithms can therefore be used to solve it. Metaheuristic algorithms generate new solutions by applying operators to current solutions, statistically moving toward better regions of the search space; they rely on an intelligent search of the solution space, require no cost-function derivatives, and emulate optimization processes found in nature. Two major components of any metaheuristic algorithm are selection of the best solutions and randomization. Selection ensures that the solutions converge toward the optimal solution, while randomization prevents the solutions from being trapped at local optima and increases their diversity. A good balance of these two components generally makes global optimality achievable. Major metaheuristic algorithms include genetic algorithms, simulated annealing, ant colony optimization, particle swarm optimization, the firefly algorithm and the harmony search algorithm (Elbeltagi et al. 2005, Schneider and Kirkpatrick 2006, Geem 2009, 2010, Yang 2010, Yetkin et al. 2009, Yetkin 2013).
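The idea of treating robust estimation as derivative-free optimization can be sketched as follows. This uses simulated annealing (one of the metaheuristics listed above, not the harmony search algorithm of the article) to minimize an L1 regression objective; the objective, step size, and cooling schedule are illustrative assumptions:

```python
import numpy as np

def l1_objective(beta, X, y):
    """Robust L1 objective: sum of absolute residuals (no derivatives needed)."""
    return np.abs(y - X @ beta).sum()

def anneal(X, y, n_iter=5000, temp0=1.0, seed=0):
    """Simulated annealing over the regression parameters."""
    rng = np.random.default_rng(seed)
    beta = np.zeros(X.shape[1])
    f = l1_objective(beta, X, y)
    best, best_f = beta.copy(), f
    for k in range(n_iter):
        temp = temp0 * (1 - k / n_iter) + 1e-6   # linear cooling schedule
        cand = beta + rng.normal(scale=0.1, size=beta.shape)
        f_cand = l1_objective(cand, X, y)
        # always accept improvements (selection); accept worse moves with
        # Boltzmann probability (randomization, to escape local optima)
        if f_cand < f or rng.random() < np.exp((f - f_cand) / temp):
            beta, f = cand, f_cand
            if f < best_f:
                best, best_f = beta.copy(), f
    return best
```

The acceptance rule makes the two components named above explicit: greedy selection drives convergence, while the temperature-controlled random acceptance maintains diversity early in the search.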