Modifications of Hestenes and Stiefel Conjugate Gradient Method for Unconstrained Optimization Problems


Bibliographic Details
Main Author: Talat Numan Alkhuli (Author)
Format: Thesis Book
Language: English
Description
Summary: Optimization refers to a common procedure applied within the science and engineering domain to determine variables that produce the best performance values. One of the most efficient techniques for solving large-scale unconstrained optimization problems is the conjugate gradient (CG) method, primarily due to its simplicity, low memory requirements, and global convergence properties. In theory, the method requires at most n steps to attain a minimum point, although convergence properties are absent in some CG variants. Several existing methods also perform poorly in terms of the number of iterations and Central Processing Unit (CPU) time. To address these shortcomings, this study proposed new CG coefficients, βk, namely Tal'at, Mamat, Rivaie (TMR) and Hybrid Tal'at and Mamat (HTM). TMR and HTM are modifications of the Hestenes and Stiefel (HS) method intended to enhance its capabilities. The convergence properties of both methods were assessed, while numerical performance was evaluated via exact and inexact line searches. Both the TMR and HTM methods were examined using 36 standard optimization test problems, each with three random initial points ranging from a point close to the solution point to a point far from it. All the standard optimization test problems were tested from small to large-scale dimensions, and the numerical experiments were run in MATLAB R2015a on a computer with an Intel® Core™ i3-M350 (2.27 GHz) CPU and 4 GB RAM. The findings were analysed based on the number of iterations and CPU time using performance profiles in graphical form. The TMR method was compared with other CG methods, namely Fletcher and Reeves (FR), HS, and Modified HS (MHS), whereas the HTM method was compared with Hybrid Hu and Storey (HHUS) and Hybrid Gilbert and Nocedal (HGN). The theoretical evidence showed that both TMR and HTM fulfil the sufficient descent condition and exhibit global convergence properties. The numerical findings revealed that the TMR and HTM methods are better than the other CG approaches in terms of the number of iterations and CPU time. The TMR and HTM methods successfully solved all the test problems under exact line search, followed by the HHUS and HGN approaches with the same success rate of 98.46%. Meanwhile, the MHS, HS, and FR methods solved the test problems at the following rates: 97.77%, 91.11%, and 67.00%, respectively. Under inexact line search, TMR and HTM achieved the highest success rates of 98.88% and 99.22%, respectively, followed by HHUS and HGN with success rates of 95.29% and 94.9%. The other methods, namely FR, HS, and MHS, displayed success rates of 92.15%, 76.68%, and 65.09%, respectively. The efficiency of the TMR and HTM methods was also assessed in real-life applications by solving regression analysis problems. All the findings showed that both the TMR and HTM methods are robust and superior, performing better than the other methods and proving suitable for practical use.
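For context on the abstract above, the following is a minimal Python sketch of a nonlinear conjugate gradient iteration using the classical Hestenes and Stiefel (HS) coefficient that the TMR and HTM coefficients modify. The TMR and HTM formulas are not given in this summary, so only the standard HS update with an inexact (Armijo backtracking) line search is shown; the test function, gradient, tolerances, and line-search parameters below are illustrative assumptions, not the author's actual experimental setup (which used MATLAB R2015a).

import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=5000):
    # Nonlinear CG with the classical Hestenes-Stiefel beta and an
    # Armijo backtracking (inexact) line search; illustrative sketch only.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                       # safeguard: restart if d is not a descent direction
            d = -g
        alpha, c, rho = 1.0, 1e-4, 0.5       # Armijo backtracking parameters (assumed values)
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference y_k
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0   # HS coefficient
        d = -g_new + beta * d                # new conjugate direction
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a point far from its minimizer (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(hs_conjugate_gradient(f, grad, [-1.2, 1.0]))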
Physical Description: xv, 188 leaves; 31 cm.
Bibliography: Includes bibliographical references (leaves 138-146)