Symmetric rank-one method and its modifications for unconstrained optimization
This thesis focuses on the theoretical and numerical behavior of some modifications of the symmetric rank-one (SR1) method, a quasi-Newton update for finding the minimum of a real-valued function f over all vectors x ∈ Rn. The SR1 update is known to have good numerical performance among the quasi-Newton methods for solving unconstrained optimization problems.
Saved in:
Main Author: Moyi, Aliyu Usman
Format: Thesis
Language: English
Published: 2014
Subjects: Newton-Raphson method
Online Access: http://psasir.upm.edu.my/id/eprint/70465/1/FS%202014%2044%20IR.pdf
id |
my-upm-ir.70465 |
---|---|
record_format |
uketd_dc |
spelling |
my-upm-ir.704652019-10-30T04:17:25Z Symmetric rank-one method and its modifications for unconstrained optimization 2014-06 Moyi, Aliyu Usman Newton-Raphson method 2014-06 Thesis http://psasir.upm.edu.my/id/eprint/70465/ http://psasir.upm.edu.my/id/eprint/70465/1/FS%202014%2044%20IR.pdf text en public doctoral Universiti Putra Malaysia Newton-Raphson method |
institution |
Universiti Putra Malaysia |
collection |
PSAS Institutional Repository |
language |
English |
topic |
Newton-Raphson method |
spellingShingle |
Newton-Raphson method Moyi, Aliyu Usman Symmetric rank-one method and its modifications for unconstrained optimization |
description |
This thesis focuses on the theoretical and numerical behavior of
some modifications of the symmetric rank-one (SR1) method, one of the quasi-Newton
updates for finding the minimum of a real-valued function f over all vectors x ∈ Rn.
The SR1 update is known to have good numerical performance
among the quasi-Newton methods for solving unconstrained optimization problems.
However, it is well known that the SR1 update may not preserve positive
definiteness even when updated from a positive definite approximation, and that it
can be undefined when its denominator is zero. The aim of this thesis is therefore
to provide effective remedies for these well-known shortcomings and to
improve the performance of the update.
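The two shortcomings named above can be seen directly from the SR1 formula. A minimal sketch, assuming a standard skip-if-unstable safeguard (the function name, tolerance, and test data below are illustrative, not the thesis's remedies):

```python
# Sketch of the classical SR1 Hessian-approximation update and its two
# well-known failure modes: a vanishing denominator and loss of
# positive definiteness.
import numpy as np

def sr1_update(B, s, y, tol=1e-8):
    """One SR1 update of Hessian approximation B from step s and
    gradient difference y. Returns B unchanged (a common safeguard)
    when the denominator is too small to be trusted."""
    r = y - B @ s                     # residual of the secant equation
    denom = r @ s                     # SR1 denominator (y - B s)^T s
    if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B                      # skip: update undefined or unstable
    return B + np.outer(r, r) / denom

# Even starting from a positive definite B, the update can lose definiteness
# when the observed curvature s^T y is negative:
B = np.eye(2)
s = np.array([1.0, 0.0])
y = np.array([-1.0, 0.0])             # s^T y = -1 < 0
B1 = sr1_update(B, s, y)
print(np.linalg.eigvalsh(B1))         # smallest eigenvalue is negative
```

Note that B1 still satisfies the secant equation B1 s = y; SR1 trades guaranteed definiteness for this exact interpolation property.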
A new inexact line search strategy for solving unconstrained optimization problems
is proposed. This method does not require evaluation of the objective
function; instead, it forces a reduction in the gradient norm along each search
direction, so it is well suited to problems in which function evaluations are
very costly. The convergence properties of this strategy are established using
a Lyapunov-function approach.
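A derivative-only rule in this spirit can be sketched as a backtracking loop that accepts a step once the gradient norm decreases, never touching f itself. The acceptance test, constants, and function names below are assumptions for illustration, not the thesis's exact strategy:

```python
# Sketch: backtracking line search driven only by gradient evaluations.
import numpy as np

def gradient_norm_search(grad, x, d, alpha0=1.0, shrink=0.5, max_iter=30):
    """Backtrack alpha until ||grad(x + alpha d)|| < ||grad(x)||."""
    g0 = np.linalg.norm(grad(x))
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(grad(x + alpha * d)) < g0:
            return alpha
        alpha *= shrink
    return alpha                      # fall back to the smallest trial step

# Example on f(x) = 0.5 ||x||^2, whose gradient is simply x:
grad = lambda x: x
x = np.array([2.0, -1.0])
d = -grad(x)                          # steepest descent direction
alpha = gradient_norm_search(grad, x, d)
print(alpha, np.linalg.norm(grad(x + alpha * d)))
```

On this quadratic the unit step already zeroes the gradient, so the first trial is accepted; on harder problems the loop shrinks alpha until the gradient norm drops.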
We also propose some scaling strategies to overcome the challenges of the
SR1 update, and prove the convergence of the resulting methods under mild
assumptions. Furthermore, to exploit the ability of the SR1 update to provide
quality Hessian approximations, we introduce a three-term conjugate gradient
method based on the symmetric rank-one update, in which a conjugate gradient
search direction is constructed without the computation and storage of matrices
and possesses the sufficient descent property. Extensive computational experiments
on standard unconstrained optimization test functions and some real-life
optimization problems, performed to examine the impact of the proposed methods
in comparison with existing methods, show significant improvement in the
performance of the SR1 method in terms of efficiency and robustness. |
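To see how a three-term direction can be matrix-free yet satisfy sufficient descent exactly, here is a sketch using a generic Hestenes-Stiefel-like choice of the two scalars; the specific formulas are illustrative and are not the thesis's SR1-based coefficients:

```python
# Sketch: a three-term conjugate gradient direction
#   d_new = -g_new + beta * d_old - theta * y,   y = g_new - g_old,
# built from vectors only (no matrix is formed or stored). With
# beta = (g_new.y)/(d_old.y) and theta = (g_new.d_old)/(d_old.y),
# the two extra terms cancel in g_new.d_new, giving
# g_new.d_new = -||g_new||^2 (sufficient descent) identically.
import numpy as np

def three_term_direction(g_new, g_old, d_old):
    y = g_new - g_old
    dy = d_old @ y
    if abs(dy) < 1e-12:
        return -g_new                 # restart with steepest descent
    beta = (g_new @ y) / dy
    theta = (g_new @ d_old) / dy
    return -g_new + beta * d_old - theta * y

# Verify the descent identity on arbitrary data:
rng = np.random.default_rng(0)
g_new, g_old, d_old = rng.normal(size=(3, 5))
d = three_term_direction(g_new, g_old, d_old)
print(np.isclose(g_new @ d, -(g_new @ g_new)))  # True
```

The identity holds for any gradients and previous direction, which is why such three-term schemes can guarantee descent without safeguarding the line search.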
format |
Thesis |
qualification_level |
Doctorate |
author |
Moyi, Aliyu Usman |
author_facet |
Moyi, Aliyu Usman |
author_sort |
Moyi, Aliyu Usman |
title |
Symmetric rank-one method and its modifications for unconstrained optimization |
title_short |
Symmetric rank-one method and its modifications for unconstrained optimization |
title_full |
Symmetric rank-one method and its modifications for unconstrained optimization |
title_fullStr |
Symmetric rank-one method and its modifications for unconstrained optimization |
title_full_unstemmed |
Symmetric rank-one method and its modifications for unconstrained optimization |
title_sort |
symmetric rank-one method and its modifications for unconstrained optimization |
granting_institution |
Universiti Putra Malaysia |
publishDate |
2014 |
url |
http://psasir.upm.edu.my/id/eprint/70465/1/FS%202014%2044%20IR.pdf |
_version_ |
1747812844251906048 |