Harmony Search
Published in Nazmul Siddique, Hojjat Adeli, Nature-Inspired Computing, 2017
As mentioned in earlier sections, an efficient local search technique can improve the quality of the solution provided by HS and speed up convergence. The BFGS is a quasi-Newton method widely used as a local search method for solving unconstrained nonlinear optimization problems (Yang, 2010). The basic idea is to replace the Hessian matrix H by an approximate matrix B, built by an iterative updating formula with rank-one matrices as its increment. For the unconstrained minimization of a function f(x), the search direction s_k at each iteration is given by B_k s_k = −∇f(x_k), where B_k is the approximation of the Hessian matrix H at the k-th iteration.
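As a rough illustration of this scheme, the Python sketch below runs a generic BFGS loop: the Hessian approximation B is updated by rank-one corrections and the search direction solves B_k s_k = −∇f(x_k). It is a minimal sketch assuming a smooth objective with an explicit gradient and a simple Armijo backtracking step; it is not the HS-BFGS hybrid discussed in the chapter.

    # Minimal BFGS sketch: B approximates the Hessian and the search direction
    # solves B_k s_k = -grad f(x_k). Generic illustration, not the chapter's HS-BFGS hybrid.
    import numpy as np

    def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
        x = np.asarray(x0, dtype=float)
        B = np.eye(x.size)                    # initial Hessian approximation
        g = grad(x)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            s = np.linalg.solve(B, -g)        # search direction from B_k s_k = -grad f(x_k)
            t = 1.0                           # simple Armijo backtracking on the step length
            while t > 1e-12 and f(x + t * s) > f(x) + 1e-4 * t * (g @ s):
                t *= 0.5
            x_new = x + t * s
            g_new = grad(x_new)
            d, y = x_new - x, g_new - g
            if d @ y > 1e-12:                 # curvature guard keeps B positive definite
                # BFGS update of B as a sum of rank-one corrections
                B = B - np.outer(B @ d, B @ d) / (d @ B @ d) + np.outer(y, y) / (y @ d)
            x, g = x_new, g_new
        return x

    # Example: the Rosenbrock function, whose minimizer is (1, 1)
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(bfgs(f, grad, [-1.2, 1.0]))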
Probabilistic power flow calculation of power system considering DGs based on improved LHS
Published in Rodolfo Dufo-López, Jaroslaw Krzywanski, Jai Singh, Emerging Developments in the Power and Energy Industry, 2019
It outperforms every other nonlinear programming method tested on a large number of test problems in terms of efficiency, accuracy, and percentage of successful solutions. This method mimics Newton's method for constrained optimization, just as in the unconstrained case. In each iteration, the Hessian of the Lagrangian function is approximated using the Broyden–Fletcher–Goldfarb–Shanno (BFGS) quasi-Newton correction method. A quadratic programming (QP) subproblem is then generated using the approximated Hessian, the solution of which is used to form the search direction of the line search process. Since the minimization problem may be non-convex, SQP guarantees only a local minimum in the neighbourhood of the initial solution.
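For a concrete picture of such an SQP-type iteration, the sketch below solves a small constrained problem with SciPy's SLSQP solver, a sequential (least-squares) quadratic programming method in the same spirit as the approach described above; the objective, constraints, and starting point are invented for the example.

    # Hedged sketch: a small constrained minimization solved with an SQP-type
    # solver (SciPy's SLSQP); the problem data are invented for illustration.
    import numpy as np
    from scipy.optimize import minimize

    objective = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.5)**2
    constraints = [{"type": "ineq", "fun": lambda x: x[0] - 2 * x[1] + 2},    # g(x) >= 0 form
                   {"type": "ineq", "fun": lambda x: -x[0] - 2 * x[1] + 6}]
    result = minimize(objective, x0=np.array([2.0, 0.0]), method="SLSQP",
                      bounds=[(0, None), (0, None)], constraints=constraints)
    print(result.x, result.fun)    # a local minimizer reached from the given starting point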
Introduction
Published in Randall L. Eubank, Ana Kupresanin, Statistical Computing in C++ and R, 2011
The minuslogl argument is the negative of the logarithm of the likelihood function that is to be minimized. The start parameter is a list of starting values for the minuslogl arguments, specified by parameter name. The fixed parameter is a similar list that designates arguments to be held fixed while assigning them specific values. The default optimization method is the quasi-Newton BFGS algorithm, but any of the other methods available for optim can be used, with their arguments specified via the ellipsis. To illustrate the use of mle, consider the function
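The function referred to above is not reproduced in this excerpt. As a rough analogue of the same idea in Python, the sketch below minimizes a negative log-likelihood with a BFGS optimizer; the normal-sample setup is invented here and is not the book's R example.

    # Rough analogue of mle's approach: minimize a negative log-likelihood with
    # a quasi-Newton (BFGS) optimizer. Invented normal-sample example, not the book's.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    data = rng.normal(loc=3.0, scale=2.0, size=200)

    def neg_loglik(params):
        mu, log_sigma = params                   # log-parameterize sigma so it stays positive
        return -np.sum(norm.logpdf(data, loc=mu, scale=np.exp(log_sigma)))

    fit = minimize(neg_loglik, x0=np.array([0.0, 0.0]), method="BFGS")
    print(fit.x[0], np.exp(fit.x[1]))            # estimates of mu and sigma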
Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
Published in Optimization, 2023
Mohammad Javad Ebadi, Amin Fahs, Hassane Fahs, Razieh Dehghani
The BFGS method has proved to be one of the most efficient quasi-Newton methods for solving unconstrained optimization problems. An excellent presentation of the theoretical aspects concerning the properties and the convergence of this method was provided by Dennis and Moré [10] and [11]. Under a convexity assumption on the objective function, the BFGS method was shown to be globally and locally superlinearly convergent (see [12,13]). Dai [14] constructed an example to show that the standard BFGS method may fail for non-convex functions with the inexact line search (3)–(4). Mascarenhas [15] showed that the standard BFGS method may not be convergent even with an exact line search. To overcome this difficulty, Yuan et al. [2] introduced a modified WWP inexact line search technique. They showed that, under this modified WWP line search and some other assumptions, the BFGS algorithm is convergent for general smooth functions.
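For reference, the standard WWP (weak Wolfe–Powell) conditions that such inexact line searches build on can be checked as in the sketch below; this is the usual sufficient-decrease plus curvature test, not the modified conditions of Yuan et al. [2], whose exact form is not reproduced in this excerpt.

    # Standard weak Wolfe-Powell (WWP) conditions for an inexact line search,
    # with 0 < delta < sigma < 1; not the modified conditions of Yuan et al. [2].
    import numpy as np

    def satisfies_wwp(f, grad, x, d, alpha, delta=1e-4, sigma=0.9):
        g_d = grad(x) @ d                        # directional derivative at x
        sufficient_decrease = f(x + alpha * d) <= f(x) + delta * alpha * g_d
        curvature = grad(x + alpha * d) @ d >= sigma * g_d
        return sufficient_decrease and curvature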
A Tweedie Compound Poisson Model in Reproducing Kernel Hilbert Space
Published in Technometrics, 2023
Yi Lian, Archer Yi Yang, Boxiang Wang, Peng Shi, Robert William Platt
We propose an inverse BFGS algorithm, which belongs to the family of quasi-Newton methods, to solve the optimization problem (10). The BFGS enjoys the fast convergence rate of Newton-type algorithms while avoiding the exact computation and inversion of the Hessian matrix, whose dimension equals the sample size. Solving the Ktweedie model requires additional considerations for the intercept α0, which are discussed in Section B. Here we first solve a simpler variant whose objective is that of (10) without the intercept. We use the superscript (k) to indicate the kth iteration of our algorithm. At each iteration, the solution is first updated by moving along the product of an approximate inverse Hessian of the objective and its gradient, scaled by a step size.
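A generic sketch of the inverse BFGS step that such an algorithm maintains is shown below: the inverse Hessian approximation H is updated directly from the step and gradient differences, so no matrix ever needs to be inverted. The symbols and the fixed step rule are the textbook form, not necessarily the notation of the paper.

    # Generic inverse BFGS step: keep H ~ (Hessian)^-1 and update it from the
    # step s and gradient change y (textbook form, not the paper's notation).
    import numpy as np

    def inverse_bfgs_step(x, g, H, grad, step_size=1.0):
        d = -H @ g                           # quasi-Newton direction: -(inverse Hessian) * gradient
        x_new = x + step_size * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:                    # curvature condition keeps H positive definite
            rho = 1.0 / (s @ y)
            I = np.eye(x.size)
            # rank-two update of the inverse approximation (BFGS in inverse form)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        return x_new, g_new, H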
A robust BFGS algorithm for unconstrained nonlinear optimization problems
Published in Optimization, 2022
We have proposed a robust BFGS algorithm that converges to a local optimum under mild assumptions, not only for convex optimization problems but also for non-convex ones. In addition, we proved that the robust BFGS algorithm is globally and superlinearly convergent even in the worst case, namely for non-convex optimization problems. We have shown that the computational cost per iteration is almost the same for the BFGS algorithm and the robust BFGS algorithm. We have provided numerical test results and compared the performance of the robust BFGS with that of other established and state-of-the-art algorithms, such as BFGS, limited-memory BFGS, descent and conjugate gradient, and limited-memory descent and conjugate gradient methods. The results and comparison show that the robust BFGS algorithm appears to be very efficient and effective.