Modified Quasi-Newton Methods For Large-Scale Unconstrained Optimization
Format: Thesis
Language: English
Published: 2003
Online access: http://psasir.upm.edu.my/id/eprint/11702/1/FSAS_2003_60.pdf
Summary: The focus of this thesis is on finding the unconstrained minimizer of a function when the dimension n is large. Specifically, we focus on the well-known class of optimization methods called quasi-Newton methods. First we briefly give some mathematical background. Then we discuss quasi-Newton methods, the fundamental approach underlying most methods for large-scale unconstrained optimization, as well as the related line search methods. A review of the optimization methods currently available for solving large-scale problems is also given.
The main practical deficiency of quasi-Newton methods is the high computational cost of forming search directions, which is the key issue in large-scale unconstrained optimization. To address this deficiency, we introduce a variety of techniques for improving quasi-Newton methods for large-scale problems, including scaling the SR1 update, matrix-storage-free methods, and the extension of modified BFGS updates to a limited-memory scheme. Comprehensive theoretical and experimental results are also given.
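The limited-memory scheme mentioned above avoids storing a dense n-by-n inverse Hessian approximation by keeping only the m most recent curvature pairs (s, y). As a rough illustration of the idea (this is the standard L-BFGS two-loop recursion, not the thesis's specific modified updates):

```python
import numpy as np

def lbfgs_direction(g, pairs):
    """Compute the search direction d = -H g, where H is the implicit
    limited-memory inverse Hessian built from the stored pairs.

    g     : current gradient (1-D NumPy array)
    pairs : list of (s, y) tuples, oldest first, where
            s = x_{k+1} - x_k and y = grad_{k+1} - grad_k.
    No n-by-n matrix is ever formed; cost is O(m n)."""
    q = g.copy()
    alphas = []
    # First loop: newest pair to oldest.
    for s, y in reversed(pairs):
        rho = 1.0 / y.dot(s)
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # Initial scaling H0 = gamma * I, using the most recent pair.
    if pairs:
        s, y = pairs[-1]
        gamma = s.dot(y) / y.dot(y)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest.
    for (s, y), a in zip(pairs, reversed(alphas)):
        rho = 1.0 / y.dot(s)
        b = rho * y.dot(r)
        r += (a - b) * s
    return -r  # descent direction
```

With a single stored pair the recursion reproduces the secant condition H y = s, which is the defining property any quasi-Newton update must satisfy.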
Finally, we comment on some achievements of our research. Possible extensions are also given to conclude this thesis.