%0 Thesis
%A Chen, Chuei Yee
%D 2009
%G English
%T A Class of Diagonally Preconditioned Limited Memory Quasi-Newton Methods for Large-Scale Unconstrained Optimization
%U http://psasir.upm.edu.my/id/eprint/7547/1/ABS_---__FS_2009_29.pdf
%X The focus of this thesis is diagonal preconditioning of the limited memory quasi-Newton method for large-scale unconstrained optimization problems. In particular, the discussion centres on the diagonally preconditioned limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method. The L-BFGS method has been widely used in large-scale unconstrained optimization because of its effectiveness; however, a major drawback is that it can be very slow on certain types of problems. Scaling and preconditioning have been used to boost its performance. In this study, a class of diagonally preconditioned L-BFGS methods is proposed. In contrast to the standard L-BFGS method, whose initial inverse Hessian approximation is the identity matrix, a class of diagonal preconditioners is derived from the weak quasi-Newton relation with an additional parameter. Different choices of this parameter lead to some well-known diagonal updating formulae and yield R-linear convergence of the L-BFGS method. Numerical experiments were performed on a set of large-scale unconstrained minimization problems to examine the impact of each choice of parameter. The computational results suggest that the proposed diagonally preconditioned L-BFGS methods outperform the standard L-BFGS method without preconditioning. Finally, we discuss the impact of the diagonal preconditioners on the L-BFGS method, compared with the standard L-BFGS method, in terms of the number of iterations, the number of function/gradient evaluations, and the CPU time in seconds.