The investigation of gradient method namely Steepest Descent and extending of Barzilai Borwein for solving unconstrained optimization problem / Nur Intan Syahirah Ismail & Nur Atikah Aziz

Bibliographic Details
Main Authors: Ismail, Nur Intan Syahirah, Aziz, Nur Atikah
Format: Thesis
Language: English
Published: 2019
Subjects:
Online Access: https://ir.uitm.edu.my/id/eprint/38918/1/38918.pdf
Item Description
Summary: Steepest Descent (SD) is one of the pioneering methods for solving unconstrained optimization problems because it is globally convergent. Although it is globally convergent, its rate of convergence is slow. This project therefore focuses on the SD method and its modifications in order to obtain a better convergence rate. The research investigates the behaviour of four gradient methods, namely Steepest Descent (SD), Barzilai-Borwein 1 (BB1), Barzilai-Borwein 2 (BB2) and Jaafar Mohamed (JM), analyses their performance in terms of CPU time and number of iterations, and demonstrates their global convergence. Eight test functions with several initial points drawn from the four geometric quadrants were chosen to test the SD, BB1, BB2 and JM methods, and the data were collected as the number of iterations and the CPU time using exact line search. The results show that all the methods possess global convergence. To compare the methods graphically, the performance profile of Dolan and Moré is used, and the best method is determined from this profile. Finally, the numerical results generated in this research serve as numerical evidence for the behaviour of the methods listed above.
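
The record itself contains no formulas or code, so the following is a minimal illustrative sketch (not taken from the thesis) of the gradient iteration x_{k+1} = x_k - alpha_k * g_k with the classical step-size rules the abstract compares: exact line search for SD on a convex quadratic, and the standard BB1 and BB2 step sizes. The JM step is omitted because its formula is not given in this record; the function names, the test matrix A, and the tolerance are illustrative assumptions.

import numpy as np

def grad(A, b, x):
    # Gradient of the convex quadratic f(x) = 0.5 * x^T A x - b^T x.
    return A @ x - b

def descend(A, b, x0, step="SD", tol=1e-8, max_iter=10000):
    """Run x_{k+1} = x_k - alpha_k * g_k with one of three step-size rules:
       SD  : exact line search on the quadratic, alpha = (g^T g) / (g^T A g)
       BB1 : alpha = (s^T s) / (s^T y)
       BB2 : alpha = (s^T y) / (y^T y)
    where s = x_k - x_{k-1} and y = g_k - g_{k-1}."""
    x = np.asarray(x0, dtype=float)
    g = grad(A, b, x)
    x_prev = g_prev = None
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            return x, k
        if step == "SD" or x_prev is None:
            # Exact line search step for a quadratic (also used for the first BB iteration).
            alpha = (g @ g) / (g @ (A @ g))
        else:
            s, y = x - x_prev, g - g_prev
            alpha = (s @ s) / (s @ y) if step == "BB1" else (s @ y) / (y @ y)
        x_prev, g_prev = x, g
        x = x - alpha * g
        g = grad(A, b, x)
    return x, max_iter

# Illustrative run on an ill-conditioned 2-D quadratic (condition number 100).
A = np.array([[100.0, 0.0], [0.0, 1.0]])
b = np.zeros(2)
x0 = np.array([1.0, 1.0])
for rule in ("SD", "BB1", "BB2"):
    _, iters = descend(A, b, x0, step=rule)
    print(rule, "iterations:", iters)

On an ill-conditioned quadratic such as this one, the BB step sizes typically need far fewer iterations than exact-line-search SD, which mirrors the kind of comparison, based on iteration counts and CPU time, described in the abstract.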