An improved multi-step gradient-type method for large scale optimization

Bibliographic Details
Main Authors: Farid, Mahboubeh, Leong, Wah June
Format: Article
Language: English
Published: Elsevier 2011
Online Access: http://psasir.upm.edu.my/id/eprint/24643/1/An%20improved%20multi.pdf
http://psasir.upm.edu.my/id/eprint/24643/
https://www.sciencedirect.com/science/article/pii/S0898122111003312?via%3Dihub
Description
Summary: In this paper, we propose an improved multi-step diagonal updating method for large scale unconstrained optimization. Our approach is based on constructing a new gradient-type method by means of interpolating curves. We measure the distances required to parameterize the interpolating polynomials via a norm defined by a positive-definite matrix. By developing an implicit updating approach, we obtain an improved Hessian approximation in diagonal matrix form while avoiding the computational expense of explicitly forming the improved approximation matrix. The effectiveness of the proposed method is evaluated by computational comparison with the BB method and its variants. We show that our method is globally convergent and requires only O(n) memory.
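The abstract uses the Barzilai-Borwein (BB) method as its comparison baseline. As context, here is a minimal sketch of the classical BB1 gradient iteration, which likewise needs only O(n) memory since no Hessian matrix is stored. This is not the paper's diagonal-updating method; the quadratic test problem and function names are illustrative only.

```python
import numpy as np

def bb_gradient(grad, x0, max_iter=500, tol=1e-8):
    """Classical Barzilai-Borwein (BB1) gradient method.

    Uses O(n) memory: only vectors and a scalar stepsize are kept,
    no Hessian or Hessian approximation matrix is ever formed.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-4  # conservative initial steplength
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB1 stepsize s^T s / s^T y; reset if curvature is non-positive
        alpha = (s @ s) / sy if sy > 0 else 1e-4
        x, g = x_new, g_new
    return x

# Illustrative usage: minimize f(x) = 0.5 x^T D x with D = diag(1..5),
# whose gradient is D x and whose unique minimizer is x = 0.
D = np.diag(np.arange(1.0, 6.0))
minimizer = bb_gradient(lambda x: D @ x, np.ones(5))
```

The stepsize s^T s / s^T y is the least-squares scalar approximation of the inverse Hessian along the latest step; diagonal-updating methods such as the one proposed here refine this idea by carrying a full diagonal approximation instead of a single scalar.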