Symmetric rank-one method and its modifications for unconstrained optimization
Main Author:
Format: Thesis
Language: English
Published: 2014
Subjects:
Online Access: http://psasir.upm.edu.my/id/eprint/70465/1/FS%202014%2044%20IR.pdf
Summary: The focus of this thesis is on the theoretical and experimental behavior of
some modifications of the symmetric rank-one method, one of the quasi-Newton
updates for finding the minimum of a real-valued function f over all vectors x ∈ Rn.
The symmetric rank-one (SR1) update is known for its good numerical performance
among quasi-Newton methods for solving unconstrained optimization problems.
However, it is well known that the SR1 update may not preserve positive
definiteness even when updated from a positive definite approximation, and it can
be undefined when its denominator is zero. The aim of this thesis is therefore to
provide effective remedies for these well-known shortcomings and to improve the
performance of the update.
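For context, the standard SR1 update of a Hessian approximation B, together with the usual skipping safeguard against the (near-)zero-denominator breakdown mentioned above, can be sketched as follows. The tolerance r is an illustrative choice, not the thesis's exact remedy.

```python
import numpy as np

def sr1_update(B, s, y, r=1e-8):
    """Symmetric rank-one (SR1) update of a Hessian approximation B:

        B_new = B + (y - B s)(y - B s)^T / ((y - B s)^T s)

    The update is skipped when the denominator is too small relative to
    ||s|| * ||y - B s||, a standard safeguard (tolerance r is illustrative).
    """
    v = y - B @ s
    denom = v @ s
    if abs(denom) <= r * np.linalg.norm(s) * np.linalg.norm(v):
        return B  # skip: update undefined or numerically unstable
    return B + np.outer(v, v) / denom
```

The updated matrix satisfies the secant equation B_new s = y whenever the update is applied, but, unlike BFGS, nothing prevents B_new from becoming indefinite.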
A new inexact line search strategy for solving unconstrained optimization problems
is proposed. This method does not require evaluation of the objective function;
instead, it forces a reduction in the gradient norm along each search direction,
making it suitable for problems in which function evaluation is very costly. The
convergence properties of this strategy are established using a Lyapunov-function
approach.
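A backtracking scheme in this spirit can be sketched as below. The acceptance condition used here, ||∇f(x + αd)|| ≤ (1 − cα)·||∇f(x)|| with a hypothetical constant c, is an illustrative stand-in for the thesis's rule; only gradient evaluations appear, never f itself.

```python
import numpy as np

def gradient_norm_linesearch(grad, x, d, alpha0=1.0, c=1e-4, tau=0.5,
                             max_iter=50):
    """Backtracking line search driven by gradient evaluations only.

    Accepts a step length alpha once the gradient norm is reduced:
        ||grad(x + alpha*d)|| <= (1 - c*alpha) * ||grad(x)||
    The acceptance test and constants are illustrative stand-ins; no
    objective values f(x) are required at any point.
    """
    g0 = np.linalg.norm(grad(x))
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(grad(x + alpha * d)) <= (1 - c * alpha) * g0:
            return alpha
        alpha *= tau  # shrink the trial step and try again
    return alpha  # fall back to the smallest trial step
```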
Similarly, some scaling strategies are proposed to overcome the challenges of the
SR1 update, and the convergence of these methods is proved under mild assumptions.
Furthermore, to exploit the good properties of the SR1 update in providing quality
Hessian approximations, a three-term conjugate gradient method is introduced via
the symmetric rank-one update, in which a conjugate gradient search direction is
constructed without the computation and storage of matrices and possesses the
sufficient descent property. Extensive computational experiments on standard
unconstrained optimization test functions and some real-life optimization
problems, carried out to examine the impact of the proposed methods in comparison
with other existing methods, show significant improvement in the efficiency and
robustness of the SR1 method.
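To illustrate how an SR1-based direction can be formed without storing any matrix, the sketch below uses a generic memoryless SR1 update of the identity, H = I + (s − y)(s − y)ᵀ / ((s − y)ᵀy), and computes d = −Hg directly from vectors. Since s = α·d_old, the result expands into a three-term combination of −g, d_old, and y. This is a textbook memoryless-SR1 construction under stated assumptions, not the thesis's exact formula.

```python
import numpy as np

def memoryless_sr1_direction(g_new, s, y, eps=1e-10):
    """Matrix-free search direction from a memoryless SR1 update of the
    identity:  H = I + (s - y)(s - y)^T / ((s - y)^T y),  d = -H g_new.

    Expanding the product gives d as a vector combination of -g_new and
    (s - y), so no matrix is ever formed or stored. The safeguard and
    tolerance eps are illustrative, not the thesis's exact rule.
    """
    u = s - y
    denom = u @ y
    if abs(denom) <= eps * np.linalg.norm(u) * np.linalg.norm(y):
        return -g_new  # safeguard: fall back to steepest descent
    return -g_new - ((u @ g_new) / denom) * u
```

This H satisfies the secant equation H y = s, so applying the direction map to y returns −s; the sufficient-descent property of the thesis's actual direction requires its specific parameter choices and is not claimed for this sketch.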