Three term conjugate gradient for solving unconstrained optimization using exact line search / Alia Syafiqa Suhaimi & Nur Hasnah Md Hassan



Bibliographic Details
Main Authors: Suhaimi, Alia Syafiqa, Md Hassan, Nur Hasnah
Format: Thesis
Language: English
Published: 2019
Subjects:
Online Access: https://ir.uitm.edu.my/id/eprint/41379/1/41379.pdf
Description
Summary: Conjugate gradient (CG) methods are a family of significant methods for solving large-scale unconstrained optimization problems, owing to attractive features such as low memory requirements, simple computation and strong global convergence. In this paper, the efficiency of the methods is measured by the number of iterations and the CPU time. This study focuses on solving the test problems with three-term conjugate gradient methods under exact line search. Four well-known classical conjugate gradient methods, namely Dai-Yuan (DY), Fletcher-Reeves (FR), Hestenes-Stiefel (HS) and Polak-Ribière-Polyak (PRP), are tested on four test functions. For every test function, twelve initial points from the four geometric quadrants are chosen, some close to the solution and some further away. The solutions obtained using exact line search possess global convergence properties. The numerical results are analysed and presented using the performance profile introduced by Dolan and Moré.
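To illustrate the kind of method the abstract describes, the following is a minimal sketch (not the thesis authors' code) of a classical conjugate gradient iteration with exact line search on a strictly convex quadratic, for which the exact step length has a closed form. The Fletcher-Reeves (FR) update is shown; the function name and the test problem are illustrative assumptions.

```python
import numpy as np

def cg_exact_linesearch(A, b, x0, tol=1e-8, max_iter=500):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by conjugate gradient with exact line search.

    On a quadratic, the exact minimizer along d_k is available in closed form:
        alpha_k = -g_k^T d_k / (d_k^T A d_k).
    """
    x = x0.astype(float)
    g = A @ x - b                      # gradient of the quadratic
    d = -g                             # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            return x, k                # converged: gradient small enough
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)    # exact line search step length
        x = x + alpha * d
        g_new = A @ x - b
        # Fletcher-Reeves beta; PRP would use g_new @ (g_new - g) / (g @ g),
        # HS and DY use analogous ratios involving (g_new - g) and d.
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d          # new conjugate direction
        g = g_new
    return x, max_iter

# Illustrative 2-D problem: minimizer solves A x = b, i.e. x* = (0, 1).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0])
x_star, iters = cg_exact_linesearch(A, b, np.array([5.0, 5.0]))
```

With exact line search on an n-dimensional quadratic, CG terminates in at most n iterations in exact arithmetic, which is why exact line search is the natural setting for comparing the classical beta formulas.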