A scaled three-term conjugate gradient method for unconstrained optimization

Bibliographic Details
Main Authors: Arzuka, Ibrahim, Abu Bakar, Mohd Rizam, Leong, Wah June
Format: Article
Language: English
Published: SpringerOpen 2016
Online Access: http://psasir.upm.edu.my/id/eprint/54932/1/A%20scaled%20three-term%20conjugate%20gradient%20method%20for%20unconstrained%20optimization.pdf
http://psasir.upm.edu.my/id/eprint/54932/
Institution: Universiti Putra Malaysia
Description
Summary: Conjugate gradient methods play an important role in many fields of application due to their simplicity, low memory requirements, and global convergence properties. In this paper, we propose an efficient three-term conjugate gradient method by utilizing the DFP update for the inverse Hessian approximation, which satisfies both the sufficient descent and the conjugacy conditions. The basic idea is that the DFP update is restarted with a multiple of the identity matrix at every iteration. An acceleration scheme is incorporated into the proposed method to enhance the reduction in function value. Numerical results from an implementation of the proposed method on a set of standard unconstrained optimization problems show that the proposed method is promising and exhibits superior numerical performance in comparison with other well-known conjugate gradient methods.
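
The construction summarized above admits a compact implementation: applying a memoryless DFP update of a scaled identity theta_k*I to the current gradient yields a search direction with three terms. The Python sketch below illustrates this general idea only; the scaling theta_k = s_k^T y_k / y_k^T y_k, the backtracking Armijo line search, and the restart safeguard are common textbook choices assumed here for illustration, not the paper's exact formulation, and the acceleration scheme mentioned in the summary is omitted.

```python
import numpy as np

def scaled_three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of a three-term CG iteration whose direction comes from a
    memoryless DFP update of the inverse Hessian restarted with a scaled
    identity theta_k * I at every iteration (illustrative assumptions only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search (assumed choice, not from the paper).
        alpha, rho, c1 = 1.0, 0.5, 1e-4
        fx = f(x)
        while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        if sy <= 1e-12:
            # Curvature safeguard: fall back to steepest descent.
            d = -g_new
        else:
            theta = sy / y.dot(y)           # assumed scaling of the identity
            # Memoryless DFP update of theta*I applied to the new gradient
            # gives the three-term direction:
            #   d = -theta*g + theta*(y'g / y'y)*y - (s'g / s'y)*s
            d = (-theta * g_new
                 + theta * (y.dot(g_new) / y.dot(y)) * y
                 - (s.dot(g_new) / sy) * s)
        x, g = x_new, g_new
    return x

# Example usage: minimize the 2D Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star = scaled_three_term_cg(f, grad, np.array([-1.2, 1.0]))
```

Because the update is restarted from a scaled identity, only the vectors s_k and y_k need to be stored, which preserves the low memory footprint characteristic of conjugate gradient methods.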