Comparison of parameter estimation methods when multicollinearity and outlier exists / Aida Nurasikin Jamil ...[et al.]

Bibliographic Details
Main Authors: Jamil, Aida Nurasikin; Abdul Muluk, Muhammad Fahmi; Anuar, Nur Sabrina; Abu Bakar, Mohamad Suffian
Format: Thesis
Language: English
Published: 2019
Subjects:
Online Access: https://ir.uitm.edu.my/id/eprint/32559/1/32559.pdf
Description
Summary: The Ordinary Least Squares (OLS) estimator performs poorly in the presence of multicollinearity and outliers. Three methods are suggested to improve the model when multicollinearity and outliers exist: Jackknife Regression (JR), based on the leave-one-out method; Ridge Regression (RR), based on the addition of a shrinkage parameter; and Latent Root Regression (LRR), which incorporates the latent roots and latent vectors. In the application, the model parameters, standard errors, lengths of confidence intervals (L.C.I.), coefficients of determination (R²), and mean square errors (MSE) of these methods are estimated. The performance of the three methods is then compared with OLS using MSE and R². Based on the analysis, LRR was the best method, having the lowest MSE and the highest R² among the methods considered. LRR was the best method not only when multicollinearity exists, but also in the presence of both multicollinearity and outliers.
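For context, a minimal sketch of the kind of comparison the abstract describes: OLS versus a ridge-type estimator on nearly collinear data, scored by MSE and R². The simulated data, the shrinkage value k, and the variable names below are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

# Illustrative sketch: compare OLS and a ridge-type estimator on
# nearly collinear data using MSE and R-squared, the criteria named
# in the abstract. Data and shrinkage parameter k are hypothetical.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)      # nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
y = 2 + 3 * x1 - 1 * x2 + rng.normal(size=n)

def fit_stats(beta):
    """Return (MSE, R^2) of the fit given coefficients beta."""
    resid = y - X @ beta
    mse = resid @ resid / (n - X.shape[1])     # residual mean square
    r2 = 1 - (resid @ resid) / np.sum((y - y.mean()) ** 2)
    return mse, r2

# OLS: beta = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: beta = (X'X + kI)^{-1} X'y, with shrinkage parameter k
k = 0.1
beta_rr = np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

print("OLS  : MSE=%.4f  R2=%.4f" % fit_stats(beta_ols))
print("Ridge: MSE=%.4f  R2=%.4f" % fit_stats(beta_rr))
```

Jackknife and latent root estimators would slot into the same comparison loop; only the formula for beta changes.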