TY - JOUR
T1 - Dealing with Multicollinearity in Regression Analysis: A Case in Psychology
AU - Matshonisa Seeletse, Solly
AU - Grace Phalwane, Motlalepula
JO - Journal of Engineering and Applied Sciences
VL - 15
IS - 13
SP - 2693
EP - 2703
PY - 2020
DA - 2001/08/19
SN - 1816-949X
DO - jeasci.2020.2693.2703
UR - https://makhillpublications.co/view-article.php?doi=jeasci.2020.2693.2703
KW - Principal components
KW - ridge regression
KW - stepwise regression
KW - multicollinearity
KW - exploratory
AB - In regression analysis, the main interest is to predict the response variable from the explanatory variables by estimating the parameters of the linear model. In practice, however, the explanatory variables may share similar characteristics. This interdependency between the explanatory variables is called multicollinearity, and it makes parameter estimation in regression analysis unreliable. Approaches to the multicollinearity problem in regression modelling include variable selection, principal component regression and ridge regression. In this study, the performances of these techniques in handling multicollinearity in simulated data are compared. Of the four regression models compared, the principal component regression model best explained the variability, and its parameter estimates were precise while addressing multicollinearity.
ER -
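
As a rough illustration of the kind of comparison the abstract describes, the sketch below simulates collinear explanatory variables, checks variance inflation factors, and fits ordinary least squares, ridge regression and principal component regression. It is a minimal Python sketch assuming numpy and scikit-learn; the simulated data, variable names, alpha value and number of components are illustrative assumptions, not the authors' settings.

    # Illustrative only: not the authors' data or code. Simulates collinear
    # predictors, reports VIFs, then compares OLS, ridge and PCR fits.
    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    n = 200

    # Predictors share a common latent factor, inducing multicollinearity.
    latent = rng.normal(size=n)
    X = np.column_stack([latent + 0.05 * rng.normal(size=n) for _ in range(4)])
    y = X @ np.array([1.0, 0.5, -0.5, 2.0]) + rng.normal(scale=0.5, size=n)

    # Variance inflation factors: values well above 10 signal multicollinearity.
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        r2 = LinearRegression().fit(others, X[:, j]).score(others, X[:, j])
        print(f"VIF x{j + 1}: {1.0 / (1.0 - r2):.1f}")

    models = {
        "OLS": LinearRegression(),
        "Ridge (alpha=1.0)": Ridge(alpha=1.0),
        "PCR (2 components)": make_pipeline(PCA(n_components=2), LinearRegression()),
    }
    for name, model in models.items():
        model.fit(X, y)
        print(name, "R^2 =", round(model.score(X, y), 3))

In this kind of setup the ridge penalty and the principal-component projection both stabilise the coefficient estimates that ordinary least squares struggles with when the predictors are nearly linearly dependent, which is the behaviour the study evaluates on its simulated data.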