Symmetric rank-one method and its modifications for unconstrained optimization


Bibliographic Details
Main Author: Moyi, Aliyu Usman
Format: Thesis
Language: English
Published: 2014
Subjects:
Online Access: http://psasir.upm.edu.my/id/eprint/70465/1/FS%202014%2044%20IR.pdf
Summary: This thesis focuses on the theoretical and experimental behavior of some modifications of the symmetric rank-one method, one of the quasi-Newton updates for finding the minimum of a real-valued function f over all vectors x ∈ R^n. The symmetric rank-one (SR1) update is known for its good numerical performance among the quasi-Newton methods for solving unconstrained optimization problems. However, it is well known that the SR1 update may fail to preserve positive definiteness even when updated from a positive definite approximation, and it can be undefined when its denominator is zero. The aim of this thesis is therefore to provide effective remedies for these well-known shortcomings and to improve the performance of the update. A new inexact line search strategy for solving unconstrained optimization problems is proposed. This method does not require evaluation of the objective function; instead, it forces a reduction of the gradient norm along each direction, which makes it suitable for problems in which function evaluation is very costly. The convergence properties of this strategy are established using the Lyapunov function approach. In addition, some scaling strategies are proposed to overcome the challenges of the SR1 update, and their convergence is proved under some mild assumptions. Furthermore, in order to exploit the good properties of the SR1 update in providing quality Hessian approximations, a three-term conjugate gradient method based on the symmetric rank-one update is introduced, in which a conjugate gradient search direction is constructed without the computation and storage of matrices and possesses the sufficient descent property. Extensive computational experiments on standard unconstrained optimization test functions and some real-life optimization problems, carried out to examine the impact of the proposed methods in comparison with existing methods, show a significant improvement in the performance of the SR1 method in terms of efficiency and robustness.
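
For context, the classical SR1 update that the thesis modifies, together with the standard skipping rule used to guard against a (near-)zero denominator, can be sketched as follows. This is a minimal NumPy illustration of the textbook update only, not the author's proposed scaling or line-search remedies; the function name sr1_update and the tolerance value r are assumptions made for the example.

    import numpy as np

    def sr1_update(B, s, y, r=1e-8):
        """Classical SR1 update of a symmetric Hessian approximation B.

        B : current symmetric approximation of the Hessian
        s : step difference,     s = x_{k+1} - x_k
        y : gradient difference, y = g_{k+1} - g_k
        r : skipping tolerance (illustrative value, not from the thesis)
        """
        v = y - B @ s
        denom = v @ s
        # Standard safeguard: skip the update when the denominator is tiny
        # relative to ||s|| * ||v||, since otherwise the update is undefined
        # or unbounded -- one of the shortcomings discussed in the abstract.
        if abs(denom) < r * np.linalg.norm(s) * np.linalg.norm(v):
            return B
        return B + np.outer(v, v) / denom

Note that even with this safeguard the resulting B need not be positive definite, which is the second shortcoming the thesis addresses through its scaling strategies.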