Modified Quasi-Newton Methods For Large-Scale Unconstrained Optimization
Format: Thesis
Language: English
Published: 2003
Online Access: http://psasir.upm.edu.my/id/eprint/11702/1/FSAS_2003_60.pdf
Summary: The focus of this thesis is on finding the unconstrained minimizer of a function when the dimension n is large. Specifically, we focus on the well-known class of optimization methods called quasi-Newton methods. First we briefly give some mathematical background. Then we discuss quasi-Newton methods, the fundamental approach underlying most methods for large-scale unconstrained optimization, as well as the related line search methods. A review of the optimization methods currently available for solving large-scale problems is also given.
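To make the general scheme concrete, the following is a minimal sketch of a dense quasi-Newton (BFGS) iteration with a backtracking Armijo line search. The function and parameter names are illustrative assumptions, not code or notation from the thesis itself:

```python
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=200):
    """Dense BFGS quasi-Newton iteration with backtracking (Armijo)
    line search. An illustrative sketch only, not the thesis's code."""
    n = x0.size
    H = np.eye(n)                # inverse-Hessian approximation
    x = x0.astype(float)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g               # search direction: O(n^2) cost per step
        # Backtracking line search enforcing the Armijo condition.
        t, c = 1.0, 1e-4
        while f(x + t * p) > f(x) + c * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:           # curvature condition keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from the standard start (-1.2, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
x_star = bfgs_minimize(f, grad, np.array([-1.2, 1.0]))
```

The O(n^2) storage for H and O(n^2) work per direction in this sketch are precisely the costs that become prohibitive as n grows, which motivates the limited-memory and matrix-storage-free techniques the thesis develops.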
The main practical deficiency of quasi-Newton methods is the high computational cost of forming search directions, which is the key issue in large-scale unconstrained optimization. To address this deficiency, we introduce a variety of techniques for improving quasi-Newton methods on large-scale problems, including scaling the SR1 update, matrix-storage-free methods, and the extension of modified BFGS updates to a limited-memory scheme. Comprehensive theoretical and experimental results are also given.
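The limited-memory, matrix-storage-free idea can be illustrated by the standard L-BFGS two-loop recursion, which applies the inverse-Hessian approximation using only the m most recent curvature pairs instead of an n-by-n matrix. This is a sketch of the standard scheme, not the thesis's own modified updates; all names here are our own:

```python
import numpy as np
from collections import deque

def lbfgs_direction(g, pairs):
    """Two-loop recursion: compute -H_k g from the stored (s, y) pairs
    without ever forming an n-by-n matrix. O(mn) memory and work."""
    q = g.copy()
    alphas = []
    for s, y in reversed(pairs):          # newest pair first
        rho = 1.0 / (s @ y)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if pairs:                             # scaled initial matrix H_0 = gamma * I
        s, y = pairs[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(pairs, reversed(alphas)):   # oldest pair first
        rho = 1.0 / (s @ y)
        b = rho * (y @ q)
        q += (a - b) * s
    return -q

# Toy usage on a quadratic f(x) = 0.5 x^T A x - b^T x with gradient A x - b.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x = np.zeros(3)
pairs = deque(maxlen=5)                   # keep at most m = 5 curvature pairs
g = A @ x - b
for _ in range(50):
    if np.linalg.norm(g) < 1e-10:
        break
    p = lbfgs_direction(g, list(pairs))
    # Exact line search along p for a quadratic: t = -g^T p / (p^T A p).
    t = -(g @ p) / (p @ A @ p)
    s = t * p
    x += s
    g_new = A @ x - b
    pairs.append((s, g_new - g))
    g = g_new
```

Because only m vector pairs are stored, memory and per-iteration cost scale linearly in n, which is what makes schemes of this kind viable when n is large.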
Finally, we comment on some achievements of our research. Possible extensions are also given to conclude this thesis.