Gradient method with multiple damping for large-scale unconstrained optimization

Gradient methods are popular because only the gradient of the objective function is required. On the other hand, these methods can be very slow if the objective function is very ill-conditioned. One possible reason for this inefficiency is that a constant criterion, which aims only at reducing the function value, is used in choosing the steplength, and this leads to a stable dynamical system with slow convergence. To overcome this, we propose a new gradient method with multiple damping, which works on the objective function and the norm of the gradient vector simultaneously. That is, the proposed method is constructed by combining damping with line search strategies, in which an individual adaptive parameter is proposed to damp the gradient vector while line searches are used to reduce the function value. Global convergence of the proposed method is established under both backtracking and nonmonotone line searches. Finally, numerical results show that the proposed algorithm performs better than some well-known CG-based methods.
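
The abstract describes the method only at a high level. As a rough illustration, the Python sketch below combines a damped steepest-descent direction with a backtracking (Armijo) line search. The damping rule mu = 1/(1 + ||g||), the function name damped_gradient_descent, and all parameter values are hypothetical placeholders for this sketch; they are not the adaptive multiple-damping scheme actually proposed by the authors.

    import numpy as np

    def damped_gradient_descent(f, grad, x0, tol=1e-6, max_iter=10000,
                                alpha0=1.0, rho=0.5, c=1e-4):
        """Minimal sketch of a damped gradient method (illustrative only)."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            gnorm = np.linalg.norm(g)
            if gnorm < tol:
                break
            # Hypothetical damping parameter (an assumption, not the paper's
            # adaptive rule): shrink the step where the gradient is steep.
            mu = 1.0 / (1.0 + gnorm)
            d = -mu * g  # damped steepest-descent direction
            # Backtracking (Armijo) line search: shrink alpha until the
            # sufficient-decrease condition holds, reducing the function value.
            alpha, fx = alpha0, f(x)
            while f(x + alpha * d) > fx + c * alpha * g.dot(d):
                alpha *= rho
            x = x + alpha * d
        return x

    # Usage: an ill-conditioned quadratic f(x) = 0.5 x^T A x.
    A = np.diag([1.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(damped_gradient_descent(f, grad, np.array([1.0, 1.0])))  # ~[0, 0]

On this quadratic, the damping shrinks the step in steep directions while the line search guarantees a monotone decrease of f; per the abstract, the paper also establishes convergence under a nonmonotone line search.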

Bibliographic Details
Main Authors: Sim, Hong Seng; Leong, Wah June; Chen, Chuei Yee
Format: Article
Published: Springer 2019
Online Access:http://psasir.upm.edu.my/id/eprint/80005/
https://link.springer.com/article/10.1007/s11590-018-1247-9
Institution: Universiti Putra Malaysia
id my.upm.eprints.80005
record_format eprints
spelling my.upm.eprints.800052023-05-31T02:00:16Z http://psasir.upm.edu.my/id/eprint/80005/ Gradient method with multiple damping for large-scale unconstrained optimization Sim, Hong Seng Leong, Wah June Chen, Chuei Yee Gradient methods are popular because only the gradient of the objective function is required. On the other hand, these methods can be very slow if the objective function is very ill-conditioned. One possible reason for this inefficiency is that a constant criterion, which aims only at reducing the function value, is used in choosing the steplength, and this leads to a stable dynamical system with slow convergence. To overcome this, we propose a new gradient method with multiple damping, which works on the objective function and the norm of the gradient vector simultaneously. That is, the proposed method is constructed by combining damping with line search strategies, in which an individual adaptive parameter is proposed to damp the gradient vector while line searches are used to reduce the function value. Global convergence of the proposed method is established under both backtracking and nonmonotone line searches. Finally, numerical results show that the proposed algorithm performs better than some well-known CG-based methods. Springer 2019 Article PeerReviewed Sim, Hong Seng and Leong, Wah June and Chen, Chuei Yee (2019) Gradient method with multiple damping for large-scale unconstrained optimization. Optimization Letters, 13 (3). pp. 617-632. ISSN 1862-4472; ESSN: 1862-4480 https://link.springer.com/article/10.1007/s11590-018-1247-9 10.1007/s11590-018-1247-9
institution Universiti Putra Malaysia
building UPM Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Putra Malaysia
content_source UPM Institutional Repository
url_provider http://psasir.upm.edu.my/
description Gradient methods are popular because only the gradient of the objective function is required. On the other hand, these methods can be very slow if the objective function is very ill-conditioned. One possible reason for this inefficiency is that a constant criterion, which aims only at reducing the function value, is used in choosing the steplength, and this leads to a stable dynamical system with slow convergence. To overcome this, we propose a new gradient method with multiple damping, which works on the objective function and the norm of the gradient vector simultaneously. That is, the proposed method is constructed by combining damping with line search strategies, in which an individual adaptive parameter is proposed to damp the gradient vector while line searches are used to reduce the function value. Global convergence of the proposed method is established under both backtracking and nonmonotone line searches. Finally, numerical results show that the proposed algorithm performs better than some well-known CG-based methods.
format Article
author Sim, Hong Seng
Leong, Wah June
Chen, Chuei Yee
spellingShingle Sim, Hong Seng
Leong, Wah June
Chen, Chuei Yee
Gradient method with multiple damping for large-scale unconstrained optimization
author_facet Sim, Hong Seng
Leong, Wah June
Chen, Chuei Yee
author_sort Sim, Hong Seng
title Gradient method with multiple damping for large-scale unconstrained optimization
title_short Gradient method with multiple damping for large-scale unconstrained optimization
title_full Gradient method with multiple damping for large-scale unconstrained optimization
title_fullStr Gradient method with multiple damping for large-scale unconstrained optimization
title_full_unstemmed Gradient method with multiple damping for large-scale unconstrained optimization
title_sort gradient method with multiple damping for large-scale unconstrained optimization
publisher Springer
publishDate 2019
url http://psasir.upm.edu.my/id/eprint/80005/
https://link.springer.com/article/10.1007/s11590-018-1247-9
_version_ 1768009362905759744