Diagonal quasi-Newton updating formula via variational principle under the log-determinant measure
Main Authors:
Format: Conference or Workshop Item
Language: English
Published: 2015
Online Access: http://psasir.upm.edu.my/id/eprint/75682/1/Diagonal%20quasi-Newton%20updating%20formula%20via%20variational%20principle%20under%20the%20log-determinant%20measure.pdf
http://psasir.upm.edu.my/id/eprint/75682/
Institution: Universiti Putra Malaysia
Summary: The quasi-Newton method has been widely used to solve unconstrained optimization problems. Its popularity is due to the fact that only the gradient of the objective function is required at each iterate. Since second derivatives (the Hessian) are not required, quasi-Newton methods are sometimes more efficient than Newton's method, especially when computing the Hessian is expensive. On the other hand, standard quasi-Newton methods require storage of a full matrix that approximates the (inverse) Hessian, so they may not be suitable for large-scale problems. In this paper, we develop a diagonal quasi-Newton updating formula using the log-determinant norm such that the update satisfies the weak secant equation. The Lagrangian dual of the variational problem is solved to obtain approximations for the Lagrange multiplier associated with the weak secant equation. An executable code is developed to compare the efficiency of the proposed method with some standard conjugate gradient methods. Numerical results show that the proposed method performs better than the conjugate gradient methods.
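The summary describes the construction only at a high level; the sketch below is one plausible reading of it, not the paper's formula. It assumes a Byrd-Nocedal-type log-determinant measure, tr(B B_k^{-1}) - ln det(B B_k^{-1}) - n, restricted to positive diagonal matrices B = diag(b), minimized subject to the weak secant equation s^T B s = s^T y. Stationarity of the Lagrangian then gives b_i = b_i^k / (1 - mu * b_i^k * s_i^2), with the multiplier mu fixed by the weak secant constraint. Here mu is found by bisection rather than by the closed-form approximations the authors derive, and the function name and parameters are hypothetical.

```python
import numpy as np

def diag_logdet_weak_secant_update(b, s, y, iters=100):
    """Illustrative sketch (assumed form, not the paper's exact formula):
    update a positive diagonal Hessian approximation diag(b) so that the
    weak secant equation s^T B s = s^T y holds, by minimising a
    log-determinant proximity measure over diagonal matrices.

    b : current positive diagonal entries of B_k
    s : step,      s_k = x_{k+1} - x_k
    y : gradient change, y_k = g_{k+1} - g_k
    """
    s2 = s * s
    sy = float(np.dot(s, y))          # s^T y
    sBs = float(np.dot(b, s2))        # s^T B_k s
    if sy <= 0.0:                     # curvature condition fails: skip update, keep B_k positive
        return b.copy()

    # Constraint residual in the multiplier mu, using b_i(mu) = b_i / (1 - mu*b_i*s_i^2).
    phi = lambda mu: float(np.dot(b / (1.0 - mu * b * s2), s2)) - sy

    if sy > sBs:
        # Root lies in (0, mu_max); phi has a pole at mu_max = 1/max(b_i s_i^2).
        lo, hi = 0.0, 1.0 / float(np.max(b * s2))
    else:
        # Root lies in (lo, 0]; expand lo until phi(lo) < 0 (phi -> -sy < 0 as mu -> -inf).
        lo, hi = -1.0, 0.0
        while phi(lo) > 0.0:
            lo *= 2.0

    # Bisection: phi is increasing, negative at lo and positive toward hi;
    # phi is never evaluated at the pole itself, so all denominators stay positive.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if phi(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    mu = 0.5 * (lo + hi)
    return b / (1.0 - mu * b * s2)
```

Because the multiplier keeps every denominator positive, the updated diagonal remains positive definite whenever s^T y > 0, which is the usual condition for quasi-Newton updates to preserve positive definiteness.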