Implementation of Symmetric Rank-One Methods for Unconstrained Optimization

The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ R^n. In many practical applications, the Hessian of the objective function may be too expensive to calculate or...

Bibliographic Details
Main Author: Khiyaban, Farzin Modarres
Format: Thesis
Language: English
Published: 2010
Subjects:
Online Access:http://psasir.upm.edu.my/id/eprint/19590/1/FS_2010_41_F.pdf
id my-upm-ir.19590
record_format uketd_dc
spelling my-upm-ir.195902013-05-27T08:02:40Z Implementation of Symmetric Rank-One Methods for Unconstrained Optimization 2010-12 Khiyaban, Farzin Modarres The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ R^n. In many practical applications, the Hessian of the objective function may be too expensive to calculate or may even be unavailable in explicit form. QN methods endeavor to circumvent the deficiencies of Newton's method (while retaining the basic structure and thus preserving, as far as possible, its advantages) by constructing approximations to the Hessian iteratively. Among QN updates, the symmetric rank-one (SR1) update has been shown to be an effective and reliable method of this kind. However, SR1 can be an awkward method, even though its performance is in general better than that of well-known QN updates. The problem is that the SR1 update may not retain positive definiteness and may become undefined because its denominator becomes zero. In recent years considerable attention has been directed towards preserving and ensuring the positive definiteness of the SR1 update, but improving the quality of the estimates has rarely been studied in depth. Our purpose in this thesis is to improve the Hessian approximation updates and to study the computational performance and convergence properties of this update. First, we briefly give some mathematical background. A review of different minimization methods that can be used to solve unconstrained optimization problems is also given. We consider a modification of the secant equation for the SR1 update. In this method, the Hessian approximation is updated based on a modified secant equation, which uses both gradient and function-value information in order to achieve higher-order accuracy in approximating the curvature of the objective function.
We then examine a new scaled memoryless SR1 method based on the modified secant equation for solving large-scale unconstrained optimization problems. We prove that the new method possesses global convergence. The rate of convergence of such algorithms is also discussed. Owing to the deficiencies of SR1, we introduce a restarting procedure that uses the eigenvalues of the SR1 update. We also introduce a variety of techniques to improve the Hessian approximations of the SR1 method for small- to large-sized problems, including multi-step and extra-updating methods, along with a structured method that uses partial information on the Hessian. Variants of the SR1 update are tested numerically and compared with several other well-known minimization methods. Finally, we comment on the achievements of our research. Possible extensions are also given to conclude this thesis. Newton-Raphson method. 2010-12 Thesis http://psasir.upm.edu.my/id/eprint/19590/ http://psasir.upm.edu.my/id/eprint/19590/1/FS_2010_41_F.pdf application/pdf en public phd doctoral Universiti Putra Malaysia Newton-Raphson method. Faculty of Science English
institution Universiti Putra Malaysia
collection PSAS Institutional Repository
language English
topic Newton-Raphson method.


spellingShingle Newton-Raphson method.


Khiyaban, Farzin Modarres
Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
description The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ R^n. In many practical applications, the Hessian of the objective function may be too expensive to calculate or may even be unavailable in explicit form. QN methods endeavor to circumvent the deficiencies of Newton's method (while retaining the basic structure and thus preserving, as far as possible, its advantages) by constructing approximations to the Hessian iteratively. Among QN updates, the symmetric rank-one (SR1) update has been shown to be an effective and reliable method of this kind. However, SR1 can be an awkward method, even though its performance is in general better than that of well-known QN updates. The problem is that the SR1 update may not retain positive definiteness and may become undefined because its denominator becomes zero. In recent years considerable attention has been directed towards preserving and ensuring the positive definiteness of the SR1 update, but improving the quality of the estimates has rarely been studied in depth. Our purpose in this thesis is to improve the Hessian approximation updates and to study the computational performance and convergence properties of this update. First, we briefly give some mathematical background. A review of different minimization methods that can be used to solve unconstrained optimization problems is also given. We consider a modification of the secant equation for the SR1 update. In this method, the Hessian approximation is updated based on a modified secant equation, which uses both gradient and function-value information in order to achieve higher-order accuracy in approximating the curvature of the objective function. We then examine a new scaled memoryless SR1 method based on the modified secant equation for solving large-scale unconstrained optimization problems. We prove that the new method possesses global convergence.
The rate of convergence of such algorithms is also discussed. Owing to the deficiencies of SR1, we introduce a restarting procedure that uses the eigenvalues of the SR1 update. We also introduce a variety of techniques to improve the Hessian approximations of the SR1 method for small- to large-sized problems, including multi-step and extra-updating methods, along with a structured method that uses partial information on the Hessian. Variants of the SR1 update are tested numerically and compared with several other well-known minimization methods. Finally, we comment on the achievements of our research. Possible extensions are also given to conclude this thesis.
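The abstract refers to the classical SR1 breakdown: the update's denominator can vanish. A minimal sketch of the standard SR1 update with the usual skipping safeguard is given below; this is an illustration of the textbook formula, not code from the thesis, and the quadratic test function and variable names are chosen for the example.

```python
import numpy as np

def sr1_update(B, s, y, r=1e-8):
    """One symmetric rank-one (SR1) update of a Hessian approximation B.

    With s = x_new - x_old and y = grad(x_new) - grad(x_old), the update is
        B+ = B + (y - B s)(y - B s)^T / ((y - B s)^T s).
    It is skipped when the denominator is too small relative to ||s|| ||y - B s||,
    the standard safeguard against the breakdown mentioned in the abstract.
    """
    v = y - B @ s
    denom = v @ s
    if abs(denom) <= r * np.linalg.norm(s) * np.linalg.norm(v):
        return B  # update undefined or numerically unstable: keep B
    return B + np.outer(v, v) / denom

# Illustrative run on the quadratic f(x) = 0.5 x^T A x, whose Hessian is A.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
B = np.eye(2)                   # initial Hessian approximation
x = np.array([1.0, 1.0])
for _ in range(3):
    g = A @ x                   # gradient of the quadratic at x
    s = -np.linalg.solve(B, g)  # quasi-Newton step
    y = A @ (x + s) - g         # gradient difference along the step
    B = sr1_update(B, s, y)
    x = x + s
```

On a quadratic, SR1 enjoys the hereditary property: after n linearly independent steps the approximation matches the true Hessian, so the short loop above recovers A and reaches the minimizer at the origin.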
format Thesis
qualification_name Doctor of Philosophy (PhD.)
qualification_level Doctorate
author Khiyaban, Farzin Modarres
author_facet Khiyaban, Farzin Modarres
author_sort Khiyaban, Farzin Modarres
title Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_short Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_full Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_fullStr Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_full_unstemmed Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_sort implementation of symmetric rank-one methods for unconstrained optimization
granting_institution Universiti Putra Malaysia
granting_department Faculty of Science
publishDate 2010
url http://psasir.upm.edu.my/id/eprint/19590/1/FS_2010_41_F.pdf
_version_ 1747811418938277888