Scaled three-term conjugate gradient method via Davidon-Fletcher-Powell update for unconstrained optimization


Bibliographic Details
Main Author: Ibrahim, Arzuka
Format: Thesis
Language: English
Published: 2015
Subjects:
Online Access:http://psasir.upm.edu.my/id/eprint/67646/1/IPM%202015%2018%20IR.pdf
Description
Summary: This thesis focuses on the development of a Scaled Three-Term Conjugate Gradient Method via the Davidon-Fletcher-Powell (DFP) quasi-Newton update for unconstrained optimization. The DFP method possesses the merits of Newton's method and the steepest descent method while overcoming their disadvantages. Over the years, the DFP update has been neglected because it lacks the self-correcting property for bad Hessian approximations. In this thesis, we propose a scaled three-term conjugate gradient method that uses the DFP update for the inverse Hessian approximation via a memoryless quasi-Newton method and satisfies both the sufficient descent and conjugacy conditions. The basic philosophy is to restart the DFP update with a multiple of the identity matrix at every iteration. An acceleration scheme is incorporated into the proposed method to enhance the reduction in function value. Numerical results from an implementation of the proposed method on standard unconstrained optimization problems show that the method is promising and exhibits superior numerical performance in comparison with other well-known conjugate gradient methods.
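
The abstract describes the method's key ingredients: a memoryless DFP update of a scaled identity matrix and the three-term search direction it induces. Applying the memoryless DFP inverse-Hessian update of theta*I, namely H = theta*I - theta*(y y')/(y' y) + (s s')/(y' s), to the new gradient yields a direction built from three terms: the gradient, the step s, and the gradient difference y. The Python sketch below illustrates this construction under stated assumptions; the Armijo backtracking line search, the spectral choice of the scaling parameter theta, and the safeguards are illustrative stand-ins, not the thesis's actual scheme (in particular, its acceleration scheme is not reproduced).

    import numpy as np

    def scaled_three_term_cg_dfp(f, grad, x0, tol=1e-6, max_iter=2000):
        """Sketch: three-term CG direction from a memoryless DFP update
        of theta*I, applied to the new gradient. Line search, scaling,
        and safeguards are illustrative assumptions."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                   # first step: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            if g.dot(d) >= 0.0:                  # safeguard: enforce descent
                d = -g
            # simple Armijo backtracking (stand-in for the thesis's
            # line-search/acceleration scheme)
            alpha = 1.0
            while (f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d)
                   and alpha > 1e-12):
                alpha *= 0.5
            x_new = x + alpha * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            ys, yy = y.dot(s), y.dot(y)
            if ys <= 1e-12:                      # curvature lost: plain restart
                d = -g_new
            else:
                theta = ys / yy                  # one common spectral scaling choice
                # d = -H g_new with the memoryless DFP update of theta*I:
                #   H = theta*I - theta*(y y')/(y' y) + (s s')/(y' s)
                d = (-theta * g_new
                     + theta * (y.dot(g_new) / yy) * y
                     - (s.dot(g_new) / ys) * s)
            x, g = x_new, g_new
        return x

    # Usage example on the Rosenbrock function (a standard test problem):
    if __name__ == "__main__":
        f = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
        grad = lambda z: np.array(
            [-2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
             200 * (z[1] - z[0]**2)])
        print(scaled_three_term_cg_dfp(f, grad, np.array([-1.2, 1.0])))

Because the approximation is rebuilt from theta*I at every iteration, no matrix is ever stored: the direction requires only a few inner products and vector updates, which is what makes the memoryless restart attractive for large-scale unconstrained problems.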