Multiple Alternate Steps Gradient Methods For Unconstrained Optimization
The focus of this thesis is on finding the unconstrained minimizer of a function using alternate-steps gradient methods. Specifically, we will focus on the well-known classes of gradient methods known as the steepest descent (SD) method and the Barzilai-Borwein (BB) method. First we briefly give...
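Since the abstract names the Barzilai-Borwein method, a minimal sketch of a BB-style gradient iteration may help orient the reader. This is an illustrative implementation of the standard BB1 step size (alpha_k = s^T s / s^T y, with s the change in iterates and y the change in gradients), not the specific alternate-steps scheme developed in the thesis; the function names and parameters are hypothetical.

```python
import numpy as np

def bb_gradient_descent(grad, x0, max_iter=100, tol=1e-8, alpha0=1e-3):
    """Minimize a smooth function using the Barzilai-Borwein (BB1) step size.

    grad   : callable returning the gradient at a point
    x0     : initial iterate
    alpha0 : fixed step size for the first iteration (BB needs two iterates)
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    x_new = x - alpha0 * g              # plain gradient step to start
    for _ in range(max_iter):
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            break
        s = x_new - x                   # change in iterates
        y = g_new - g                   # change in gradients
        # BB1 step size: alpha = (s^T s) / (s^T y); fall back if s^T y <= 0
        denom = s @ y
        alpha = (s @ s) / denom if denom > 0 else alpha0
        x, g = x_new, g_new
        x_new = x_new - alpha * g_new
    return x_new

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
sol = bb_gradient_descent(lambda x: A @ x - b, np.zeros(2))
```

For quadratics, s^T y = s^T A s is always positive, so the BB step is well defined at every iteration; the nonmonotone behavior of the resulting step sizes is what gives BB its practical speed advantage over classical steepest descent.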
Format: Thesis
Language: English
Published: 2009
Online Access: http://psasir.upm.edu.my/id/eprint/12367/1/IPM_2009_11A.pdf