Three term conjugate gradient for solving unconstrained optimization using exact line search / Alia Syafiqa Suhaimi & Nur Hasnah Md Hassan

Bibliographic Details
Main Authors: Suhaimi, Alia Syafiqa, Md Hassan, Nur Hasnah
Format: Thesis
Language: English
Published: 2019
Subjects:
Online Access: https://ir.uitm.edu.my/id/eprint/41379/1/41379.pdf
Description
Summary: Conjugate gradient (CG) methods are a family of significant methods for solving large-scale unconstrained optimization problems because of their attractive features such as low memory requirements, simple computation and strong global convergence. In this paper, the efficiency of the methods is measured by the number of iterations and the CPU time. The focus of this study is on solving the test problems using three-term conjugate gradient methods with exact line search. Four well-known classical conjugate gradient methods, namely Dai and Yuan (DY), Fletcher-Reeves (FR), Hestenes-Stiefel (HS) and Polak-Ribière-Polyak (PRP), are tested on four test functions. For every test function, twelve initial points from the four geometrical quadrants are chosen, some close to the solution and some further away. The methods using exact line search are shown to possess global convergence properties. The numerical results are analysed and presented using the performance profile introduced by Dolan and Moré.
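To make the ingredients of the abstract concrete, the sketch below runs a classical CG iteration with exact line search on a strictly convex quadratic, where the exact step length along a direction d has the closed form alpha = -(g^T d) / (d^T A d), and encodes the four beta update rules (FR, PRP, HS, DY) named above. This is a minimal illustrative sketch, not the thesis code: the quadratic objective, the test matrix A, the starting point and the function names (cg_quadratic, beta_fr, ...) are assumptions, and the direction update shown is the classical two-term form d_{k+1} = -g_{k+1} + beta_k d_k rather than the specific three-term formula used in the thesis.

```python
import numpy as np

def beta_fr(g_new, g_old, d, y):
    # Fletcher-Reeves
    return g_new @ g_new / (g_old @ g_old)

def beta_prp(g_new, g_old, d, y):
    # Polak-Ribiere-Polyak
    return g_new @ y / (g_old @ g_old)

def beta_hs(g_new, g_old, d, y):
    # Hestenes-Stiefel
    return g_new @ y / (d @ y)

def beta_dy(g_new, g_old, d, y):
    # Dai-Yuan
    return g_new @ g_new / (d @ y)

def cg_quadratic(A, b, x0, beta_rule, tol=1e-8, max_iter=500):
    """CG with exact line search on f(x) = 0.5 x^T A x - b^T x.

    For this quadratic the exact minimizer of f(x + alpha d) is
    alpha = -(g^T d) / (d^T A d), so the line search is available in
    closed form (for general functions an exact search must be done
    numerically).
    """
    x = x0.astype(float)
    g = A @ x - b          # gradient of the quadratic
    d = -g                 # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            return x, k
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)     # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g                   # gradient difference y_k
        beta = beta_rule(g_new, g, d, y)
        d = -g_new + beta * d           # classical two-term CG update
        g = g_new
    return x, max_iter

# Illustrative run: the four classical rules on a small SPD quadratic,
# from one of several possible starting points (the thesis uses twelve
# initial points per test function).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
for name, rule in [("FR", beta_fr), ("PRP", beta_prp),
                   ("HS", beta_hs), ("DY", beta_dy)]:
    x_star, iters = cg_quadratic(A, b, np.array([10.0, -10.0]), rule)
    print(name, iters, x_star)
```

A three-term variant typically extends the direction update to d_{k+1} = -g_{k+1} + beta_k d_k + theta_k y_k for some scalar theta_k (the particular choice studied in the thesis is not reproduced here), and the performance profile of Dolan and Moré then compares the recorded iteration counts and CPU times of the four rules across all test functions and initial points.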