Implementation of Symmetric Rank-One Methods for Unconstrained Optimization

The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ Rⁿ. In many practical applications, the Hessian of the objective function may be too expensive to calculate or...


Bibliographic Details
Main Author: Khiyaban, Farzin Modarres
Format: Thesis
Language:English
Published: 2010
Online Access:http://psasir.upm.edu.my/id/eprint/19590/1/FS_2010_41_F.pdf
http://psasir.upm.edu.my/id/eprint/19590/
Institution: Universiti Putra Malaysia
Language: English
id my.upm.eprints.19590
record_format eprints
spelling my.upm.eprints.195902013-05-27T08:02:40Z http://psasir.upm.edu.my/id/eprint/19590/ Implementation of Symmetric Rank-One Methods for Unconstrained Optimization Khiyaban, Farzin Modarres The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ Rⁿ. In many practical applications, the Hessian of the objective function may be too expensive to calculate or may even be unavailable in explicit form. QN methods endeavor to circumvent the deficiencies of Newton's method (while retaining its basic structure and thus preserving, as far as possible, its advantages) by constructing approximations to the Hessian iteratively. Among QN updates, the symmetric rank-one (SR1) update has been shown to be an effective and reliable method. However, SR1 can be an awkward method, even though its performance is in general better than that of well-known QN updates. The problem is that the SR1 update may not retain positive definiteness and may become undefined because its denominator becomes zero. In recent years considerable attention has been directed towards preserving and ensuring the positive definiteness of the SR1 update, but improving the quality of the estimates has rarely been studied in depth. Our purpose in this thesis is to improve the Hessian approximation updates and to study the computational performance and convergence properties of this update. First, we briefly give some mathematical background. A review of different minimization methods that can be used to solve unconstrained optimization problems is also given. We consider a modification of the secant equation for the SR1 update. In this method, the Hessian approximation is updated based on a modified secant equation, which uses both gradient and function-value information in order to achieve higher-order accuracy in approximating the curvature of the objective function. 
We then examine a new scaled memoryless SR1 method based on the modified secant equation for solving large-scale unconstrained optimization problems. We prove that the new method possesses global convergence. The rate of convergence of such algorithms is also discussed. Due to the presence of the SR1 deficiencies, we introduce a restarting procedure that uses the eigenvalues of the SR1 update. We also introduce a variety of techniques to improve the Hessian approximations of the SR1 method for small- to large-sized problems, including multi-step and extra-updating methods, along with a structured method that uses partial information on the Hessian. Variants of the SR1 update are tested numerically and compared with several other well-known minimization methods. Finally, we comment on some achievements of our research. Possible extensions are also given to conclude this thesis. 2010-12 Thesis NonPeerReviewed application/pdf en http://psasir.upm.edu.my/id/eprint/19590/1/FS_2010_41_F.pdf Khiyaban, Farzin Modarres (2010) Implementation of Symmetric Rank-One Methods for Unconstrained Optimization. PhD thesis, Universiti Putra Malaysia. English
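The classical SR1 update the abstract refers to, together with the standard safeguard against its vanishing denominator, can be sketched as follows. This is a minimal illustration of the textbook SR1 formula, not the thesis's modified secant-equation variant; the function name and threshold parameter are assumptions for illustration only.

```python
import numpy as np

def sr1_update(B, s, y, r=1e-8):
    """Classical SR1 Hessian-approximation update (illustrative sketch).

    B : current symmetric Hessian approximation (n x n)
    s : step, s = x_{k+1} - x_k
    y : gradient change, y = grad f(x_{k+1}) - grad f(x_k)
    r : safeguard threshold; the update is skipped when the
        denominator (y - B s)^T s is too small relative to
        ||s|| * ||y - B s||, the standard remedy for the SR1
        denominator becoming zero.
    """
    v = y - B @ s
    denom = v @ s
    # Skip the update if the denominator is (nearly) zero;
    # this keeps the approximation well defined.
    if abs(denom) < r * np.linalg.norm(s) * np.linalg.norm(v):
        return B
    # Rank-one correction: B + v v^T / (v^T s).
    return B + np.outer(v, v) / denom
```

By construction the updated matrix satisfies the secant equation B_{k+1} s = y and remains symmetric, although, as the abstract notes, it need not remain positive definite.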
institution Universiti Putra Malaysia
building UPM Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Putra Malaysia
content_source UPM Institutional Repository
url_provider http://psasir.upm.edu.my/
language English
description The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ Rⁿ. In many practical applications, the Hessian of the objective function may be too expensive to calculate or may even be unavailable in explicit form. QN methods endeavor to circumvent the deficiencies of Newton's method (while retaining its basic structure and thus preserving, as far as possible, its advantages) by constructing approximations to the Hessian iteratively. Among QN updates, the symmetric rank-one (SR1) update has been shown to be an effective and reliable method. However, SR1 can be an awkward method, even though its performance is in general better than that of well-known QN updates. The problem is that the SR1 update may not retain positive definiteness and may become undefined because its denominator becomes zero. In recent years considerable attention has been directed towards preserving and ensuring the positive definiteness of the SR1 update, but improving the quality of the estimates has rarely been studied in depth. Our purpose in this thesis is to improve the Hessian approximation updates and to study the computational performance and convergence properties of this update. First, we briefly give some mathematical background. A review of different minimization methods that can be used to solve unconstrained optimization problems is also given. We consider a modification of the secant equation for the SR1 update. In this method, the Hessian approximation is updated based on a modified secant equation, which uses both gradient and function-value information in order to achieve higher-order accuracy in approximating the curvature of the objective function. We then examine a new scaled memoryless SR1 method based on the modified secant equation for solving large-scale unconstrained optimization problems. We prove that the new method possesses global convergence. 
The rate of convergence of such algorithms is also discussed. Due to the presence of the SR1 deficiencies, we introduce a restarting procedure that uses the eigenvalues of the SR1 update. We also introduce a variety of techniques to improve the Hessian approximations of the SR1 method for small- to large-sized problems, including multi-step and extra-updating methods, along with a structured method that uses partial information on the Hessian. Variants of the SR1 update are tested numerically and compared with several other well-known minimization methods. Finally, we comment on some achievements of our research. Possible extensions are also given to conclude this thesis.
format Thesis
author Khiyaban, Farzin Modarres
spellingShingle Khiyaban, Farzin Modarres
Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
author_facet Khiyaban, Farzin Modarres
author_sort Khiyaban, Farzin Modarres
title Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_short Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_full Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_fullStr Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_full_unstemmed Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_sort implementation of symmetric rank-one methods for unconstrained optimization
publishDate 2010
url http://psasir.upm.edu.my/id/eprint/19590/1/FS_2010_41_F.pdf
http://psasir.upm.edu.my/id/eprint/19590/
_version_ 1643827081633595392