A Class of Diagonally Preconditioned Limited Memory Quasi-Newton Methods for Large-Scale Unconstrained Optimization

Bibliographic Details
Main Author: Chen, Chuei Yee
Format: Thesis
Language: English
Published: 2009
Subjects: Mathematical optimization - Case studies
Online Access: http://psasir.upm.edu.my/id/eprint/7547/1/ABS_---__FS_2009_29.pdf
institution Universiti Putra Malaysia
collection PSAS Institutional Repository
language English
topic Mathematical optimization - Case studies
description This thesis focuses on diagonal preconditioning of the limited memory quasi-Newton method for large-scale unconstrained optimization; in particular, the discussion centres on the diagonally preconditioned limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method. The L-BFGS method is widely used in large-scale unconstrained optimization because of its effectiveness; however, a major drawback is that it can be very slow on certain types of problems. Scaling and preconditioning have been used to boost its performance. In this study, a class of diagonally preconditioned L-BFGS methods is proposed. In contrast to the standard L-BFGS method, whose initial inverse Hessian approximation is the identity matrix, a class of diagonal preconditioners is derived from the weak quasi-Newton relation with an additional parameter. Different choices of this parameter lead to some well-known diagonal updating formulae and yield R-linear convergence of the L-BFGS method. Numerical experiments were performed on a set of large-scale unconstrained minimization problems to examine the impact of each choice of parameter. The computational results suggest that the proposed diagonally preconditioned L-BFGS methods outperform the standard L-BFGS method without preconditioning. Finally, we discuss the impact of the diagonal preconditioners on the L-BFGS method, compared with the standard L-BFGS method, in terms of the number of iterations, the number of function/gradient evaluations, and the CPU time in seconds.
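
For context on the relation named in the abstract, a brief sketch using standard definitions (background material, not reproduced from the thesis): with step s_k = x_{k+1} - x_k and gradient difference y_k = g_{k+1} - g_k, the full quasi-Newton (secant) equation

    B_{k+1} s_k = y_k

pins down all n diagonal entries componentwise, and the resulting diagonal need not stay positive definite. The weak quasi-Newton relation of Dennis and Wolkowicz relaxes this to the single scalar condition

    s_k^T B_{k+1} s_k = s_k^T y_k,

which only requires the correct curvature along the step s_k and so leaves enough freedom to derive parameterized families of diagonal updates such as the preconditioners studied here.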
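
To illustrate where such a preconditioner enters the algorithm, here is a minimal Python sketch of the standard L-BFGS two-loop recursion with a pluggable diagonal initial inverse Hessian approximation (the function name lbfgs_direction and the parameter h0_diag are illustrative choices of mine; the thesis's actual diagonal updating formula is not reproduced here):

import numpy as np

def lbfgs_direction(g, s_list, y_list, h0_diag):
    # Two-loop recursion: returns the search direction -H_k g, where the
    # initial inverse Hessian approximation is H_0 = diag(h0_diag).
    # The standard method uses h0_diag = np.ones(n); a diagonally
    # preconditioned variant supplies curvature-based entries instead.
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)          # newest correction pair first
        alphas.append(alpha)
        q = q - alpha * y
    r = h0_diag * q                         # apply diagonal H_0 (O(n) cost)
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)           # oldest correction pair first
        r = r + (alpha - beta) * s
    return -r

Because H_0 is applied only as an elementwise product, replacing the identity with a diagonal preconditioner adds essentially no cost per iteration, which is why the abstract compares the variants on iteration counts, function/gradient evaluations, and CPU time.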
format Thesis
qualification_level Master's degree
author Chen, Chuei Yee
title A Class of Diagonally Preconditioned Limited Memory Quasi-Newton Methods for Large-Scale Unconstrained Optimization
granting_institution Universiti Putra Malaysia
granting_department Faculty of Science
publishDate 2009
url http://psasir.upm.edu.my/id/eprint/7547/1/ABS_---__FS_2009_29.pdf