
Abstract

Multicollinearity, arising from the violation of the independence assumption among explanatory variables in a linear regression model, poses a significant challenge to parameter estimation. It inflates the variances of the Ordinary Least Squares (OLS) estimates, leading to unstable coefficient estimates and unreliable inference. To mitigate this problem, several biased estimators such as the Ridge and Liu estimators have been developed. Recently, Kibria and Lukman (2020) introduced the Kibria–Lukman Estimator (KLE), a ridge-type alternative designed to improve estimation accuracy under multicollinearity. However, the efficiency of ridge-type estimators critically depends on the choice of the biasing parameter, which controls the trade-off between bias and variance. This study conducts a comprehensive evaluation of 25 existing ridge biasing parameters alongside three newly proposed parameters within the KLE framework. The proposed estimators were assessed using extensive Monte Carlo simulations under varying levels of multicollinearity and sample sizes. Performance was evaluated based on the Mean Squared Error (MSE) criterion. The results reveal that the proposed estimator, Ridge_kgk, consistently outperforms other competing estimators, demonstrating superior efficiency and stability across different data conditions. The findings highlight the potential of the new biasing parameters in enhancing the robustness and predictive accuracy of ridge-type estimators in regression settings affected by multicollinearity.

Keywords

Regression, Multicollinearity, Kibria–Lukman estimator, Simulation study, Mean Squared Error

Introduction

Multiple Linear Regression (MLR) extends the simple linear regression framework by incorporating two or more explanatory variables into a single predictive model for a continuous response variable. The general form of the model is expressed as follows:

yᵢ = β₁xᵢ₁ + β₂xᵢ₂ + ⋯ + βₚxᵢₚ + eᵢ    (1)

for i = 1, 2, …, n, where β₁, β₂, …, βₚ are the regression coefficients, xᵢ₁, …, xᵢₚ are the independent variables, yᵢ is the dependent variable, and eᵢ is the stochastic error term. In matrix form, the n equations can be written as:

y = Xβ + e    (2)

Where y denotes an n × 1 vector of observed responses, β represents a p × 1 vector of unknown regression coefficients, X is an n × p matrix of observed explanatory variables, and e is an n × 1 vector of random error terms assumed to follow a multivariate normal distribution with mean vector 0 and covariance matrix σ²Iₙ, where Iₙ is an identity matrix of order n. The Ordinary Least Squares (OLS) estimator of β is therefore expressed as:

β̂ = (X'X)⁻¹X'y    (3)

The covariance matrix of β̂ is estimated as Cov(β̂) = σ²(X'X)⁻¹. It is evident that both the estimator β̂ and its covariance structure are highly dependent on the properties of the matrix X'X.

1.1 Ridge Regression

The Ridge regression (RR), originally introduced in 1970 by Hoerl and Kennard, was developed to address the issue of multicollinearity commonly encountered in engineering and other empirical data analyses. Their pioneering study revealed that the introduction of a positive ridge parameter k leads to a ridge regression estimator whose Mean Squared Error (MSE) is lower than the variance of the Ordinary Least Squares (OLS) estimator, thereby achieving greater estimation efficiency through an optimal bias–variance trade-off. Consequently, the ridge regression estimator (RRE) is defined as follows:

β̂(k) = [Z + kIₚ]⁻¹X'y = Mβ̂    (4)

Where M = [Iₚ + kZ⁻¹]⁻¹, k ≥ 0, Z = X'X, and Iₚ denotes an identity matrix of order p. This expression defines the ridge regression estimator. Since the matrix [Z + kIₚ] remains invertible for all k > 0, a unique solution β̂(k) always exists. The ridge estimator is inherently biased; however, for a positive ridge parameter k, it often achieves a smaller MSE than the OLS estimator. From Eq. (4), it follows that as k → 0, β̂(k) → β̂ (the OLS estimator), and as k → ∞, β̂(k) → 0. The parameter k, often referred to as the ridge or biasing parameter, must be estimated from empirical data. In recent decades, considerable research efforts in the domains of multicollinearity and ridge regression estimation have focused on determining appropriate and efficient methods for estimating k. Numerous scholars have contributed to this line of inquiry, proposing various modified forms of ridge-type estimators. Notably, Hoerl and Kennard (1970) introduced the original ridge regression estimator, which was subsequently extended through the development of the Modified Ridge Regression (MRR) estimator, the Liu estimator (Liu, 1993), and more recently, the Kibria–Lukman estimator (Kibria and Lukman, 2020).

1.2 The Kibria–Lukman Estimator

The newly formulated one-parameter estimator is obtained by optimizing the following objective function, designed to balance bias and variance in the estimation process:

min over β: (y − Xβ)'(y − Xβ) + k(β + β̂)'(β + β̂)

Minimization of the objective function with respect to β leads to the corresponding normal equations:

(X'X + kIₚ)β = X'y − kβ̂

In this formulation, k represents a nonnegative constant. Solving the preceding equation with respect to β produces the explicit form of the proposed estimator as:

β̂_KL = [Z + kIₚ]⁻¹[Z − kIₚ]β̂ = MF_kβ̂, where M = [Iₚ + kZ⁻¹]⁻¹ and F_k = Iₚ − kZ⁻¹

The proposed estimator, hereafter referred to as the Kibria–Lukman (KL) estimator, is denoted by β̂_KL and serves as a ridge-type modification of the conventional OLS estimator, with biasing parameter k > 0 (Kibria and Lukman, 2020). As with any regression estimator, the determination of an appropriate biasing parameter in the recently developed Kibria–Lukman Estimator (KLE) is crucial for assessing its efficiency and overall performance. Over the years, several studies have proposed and examined various estimators for the ridge regression biasing parameter (k). Foundational contributions include those of Hoerl and Kennard (1970), Hoerl, Kennard, and Baldwin (1975), McDonald and Galarneau (1975), Lawless and Wang (1976), and Dempster, Schatzoff, and Wermuth (1977). Subsequent advancements were made by Gibbons (1981), Kibria (2003), Khalaf and Shukur (2005), Alkhamisi and Shukur (2008), Muniz and Kibria (2009), Muniz, Kibria, Mansson, and Shukur (2012), and Mansson, Shukur, and Kibria (2010). More recent developments include the works of Hefnawy and Farag (2013), Aslam (2014), Arashi and Valizadeh (2015), Dorugade (2016), Lukman and Ayinde (2017), Owolabi et al. (2022), Kibria (2022), and Adedoyin et al. (2025), among others. Despite these extensive efforts, there has been limited discussion of the interplay between multicollinearity and the error variance (noise parameter): a situation in which a high degree of multicollinearity is often accompanied by inflated error variance. This challenge can substantially affect the performance of existing biasing parameter estimators. Therefore, in this study, we propose a new estimator for the biasing parameter (k) within the Kibria–Lukman Estimator (KLE) framework to effectively address this limitation.
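The closed form of the KL estimator can be sketched directly in a few lines; the NumPy routine below follows (X'X + kI)⁻¹(X'X − kI)β̂, and the synthetic data used for the sanity check are illustrative assumptions only.

```python
import numpy as np

def kl_estimator(X, y, k):
    """Kibria-Lukman estimator: (Z + kI)^{-1}(Z - kI) beta_ols, with Z = X'X."""
    p = X.shape[1]
    Z = X.T @ X
    beta_ols = np.linalg.solve(Z, X.T @ y)
    return np.linalg.solve(Z + k * np.eye(p), (Z - k * np.eye(p)) @ beta_ols)

# Sanity check on synthetic data: at k = 0 the KL estimator reduces to OLS.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 2))
y = X @ np.array([2.0, -1.0]) + rng.normal(size=30)
print(kl_estimator(X, y, 0.0) - np.linalg.solve(X.T @ X, X.T @ y))  # ≈ 0
```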
2. Statistical Methodology

2.1 Canonical Form

The canonical form of the model is:

y = Aα + e

Where A = XP and α = P'β. Here, P is an orthogonal matrix such that P'X'XP = Λ = diag(λ₁, λ₂, …, λₚ), where λⱼ denotes the j-th eigenvalue of X'X.

The OLS estimator of α is:

α̂ = Λ⁻¹A'y

The ridge estimator (RE) of α is:

α̂(k) = [Λ + kIₚ]⁻¹A'y = [Λ + kIₚ]⁻¹Λα̂

where k is the biasing parameter.
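The canonical transformation can be verified numerically. The short NumPy sketch below (synthetic X, illustrative only) checks that A'A is diagonal and that the rotated OLS solution maps back to β̂ via β = Pα.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(25, 3))
y = X @ np.array([1.0, 0.5, -1.0]) + rng.normal(size=25)

# Spectral decomposition X'X = P Lambda P' with P orthogonal.
lam, P = np.linalg.eigh(X.T @ X)
A = X @ P                        # canonical regressors, A'A = Lambda (diagonal)

# OLS in canonical coordinates: alpha = Lambda^{-1} A'y, and beta = P alpha.
alpha = (A.T @ y) / lam
beta = np.linalg.solve(X.T @ X, X.T @ y)
print(P @ alpha - beta)          # ≈ 0: same fit, rotated coordinates
```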

Thus, the MSE of the ridge estimator can be written as:

MSE(α̂(k)) = σ² Σⱼ λⱼ/(λⱼ + k)² + k² Σⱼ αⱼ²/(λⱼ + k)², with sums over j = 1, …, p,

where the first term is the variance and the second the squared bias. Finally, the MSE of the Kibria–Lukman estimator, after using the above-stated definitions with α̂_KL = [Λ + kIₚ]⁻¹[Λ − kIₚ]α̂, can be written as:

MSE(α̂_KL) = σ² Σⱼ (λⱼ − k)²/(λⱼ(λⱼ + k)²) + 4k² Σⱼ αⱼ²/(λⱼ + k)²

Differentiating this expression with respect to k and setting the derivative to zero yields the value of k that minimizes the j-th component of the MSE:

kⱼ = σ²/(2αⱼ² + σ²/λⱼ)
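As a numerical cross-check, assuming the componentwise MSE of the KL estimator given in Kibria and Lukman (2020), a grid search recovers the analytic minimizer kⱼ = σ²/(2αⱼ² + σ²/λⱼ). The values of λ, α, and σ² below are hypothetical, chosen only for illustration.

```python
import numpy as np

def mse_kl_component(k, lam, alpha, sigma2):
    """j-th component of the KL estimator's MSE in canonical form."""
    var = sigma2 * (lam - k) ** 2 / (lam * (lam + k) ** 2)
    bias2 = 4.0 * k**2 * alpha**2 / (lam + k) ** 2
    return var + bias2

lam, alpha, sigma2 = 2.0, 0.8, 1.5                 # hypothetical values
k_opt = sigma2 / (2 * alpha**2 + sigma2 / lam)     # closed-form minimiser

# A fine grid search agrees with the analytic optimum.
ks = np.linspace(1e-4, 5.0, 2001)
k_grid = ks[np.argmin(mse_kl_component(ks, lam, alpha, sigma2))]
print(k_opt, k_grid)
```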

References

  1. Adedoyin, M. A., Oladapo, O. J., & Adejumo, A. O. (2025). A modified biasing ridge estimator for addressing multicollinearity problem in linear regression model. Journal of the Royal Statistical Society, Nigeria Group, 2(1). https://publications.funaab.edu.ng/index.php/JRSS-NIG/article/view/1943
  2. Alkhamisi, M. & Shukur, G. (2008). Developing ridge parameters for SUR model. Communications in Statistics – Theory and Methods, 37(4), 544-564. doi: 10.1080/03610920701469152.
  3. Alkhamisi, M., Khalaf, G., & Shukur, G. (2006). Some modifications for choosing ridge parameters. Communications in Statistics – Theory and Methods, 35(11), 2005-2020. doi: 10.1080/03610920600762905.
  4. Arashi, M. & Valizadeh, T. (2015). Performance of Kibria’s methods in partial linear ridge regression model. Statistical Papers, 56(1), 231-246. doi:10.1007/s00362-014-0578-6.
  5. Aslam, M. (2014). Performance of Kibria's method for the heteroscedastic ridge regression model: Some Monte Carlo evidence. Communications in Statistics – Simulation and Computation. 43(4), 673-686. doi:10.1080/03610918.2012.712185.
  6. Dempster, A. P., Schatzoff, M., & Wermuth, N. (1977). A simulation study of alternatives to ordinary least squares. Journal of the American Statistical Association, 72(357), 77-91. doi: 10.1080/01621459.1977.10479910.
  7. Dorugade, A. V. (2016). New ridge parameters for ridge regression. Journal of the Association of Arab Universities for Basic and Applied Sciences, 1-6.
  8. Gibbons, D. G. (1981). A simulation study of some ridge estimators. Journal of the American Statistical Association, 76(373), 131-139. doi: 10.1080/01621459.1981.10477619.
  9. Hefnawy, E. A. & Farag A. (2013). A combined nonlinear programming model and Kibria method for choosing ridge parameter regression. Communications in Statistics – Simulation and Computation, 43(6). doi:10.1080/03610918.2012.735317.
  10. Hocking, R. R., Speed, F. M., & Lynn, M. J. (1976). A class of biased estimators in linear regression. Technometrics, 18(4), 55-67. doi:10.1080/00401706.1976.10489474.
  11. Hoerl, A. E. & Kennard, R. W. (1970). Ridge regression: Biased estimation for non-orthogonal problems. Technometrics, 12(1), 55-67. doi:10.1080/00401706.1970.10488634.
  12. Hoerl, A. E., Kennard, R. W., & Baldwin, K. F. (1975). Ridge regression: Some simulations. Communications in Statistics, 4(2), 105-123. doi:10.1080/03610927508827232.
  13. Idowu, J. I., Oladapo, O. J., Owolabi, A. T., & Ayinde, K. (2022). On the biased two-parameter estimator to combat multicollinearity in linear regression model. African Scientific Reports, 1(3), 188–204.
  14. Idowu, J. I., Oladapo, O. J., Owolabi, A. T., Ayinde, K., & Akinmoju, O. (2023). Combating multicollinearity: A new two-parameter approach. Nicel Bilimler Dergisi, 5(2), 90–116. https://doi.org/10.51541/nicel.1084768
  15. Khalaf, G. (2012). A proposed ridge parameter to improve the least squares estimator. Journal of Modern Applied Statistical Methods, 11(2), 443-449.
  16. Khalaf, G. & Shukur, G. (2005). Choosing ridge parameters for regression problems. Communications in Statistics – Theory and Methods, 34(5), 1177-1182. doi: 10.1081/STA-200056836.
  17. Kibria, B. M. G. (2003). Performance of some new ridge regression estimators. Communications in Statistics – Simulation and Computation, 32(2), 419-435. doi: 10.1081/SAC-120017499.
  18. Kibria, B. M. G. & Lukman, A. F. (2020). A new ridge-type estimator for the linear regression model: Simulations and applications. Scientifica, Article ID 9758378, 16 pages.
  19. Kibria, B. M. G. (2022). More than hundred (100) estimators for estimating the shrinkage parameter in linear and generalized linear ridge regression models. Journal of Econometrics and Statistics, 2(2), 233–252.
  20. Lukman, A. F. & Ayinde, K. (2017). Review and classifications of the ridge parameter estimation techniques. Hacettepe Journal of Mathematics and Statistics, 46(5), 953-967.
  21. Lawless, J. F. & Wang, P. (1976). A simulation study of ridge and other regression estimators. Communications in Statistics – Theory and Methods, 5(4), 307-323. doi: 10.1080/03610927608827353.
  22. Mansson, K., Shukur, G. & Kibria, B. M. G. (2010). On some ridge regression estimators: A Monte Carlo simulation study under different error variances. Journal of Statistics, 17(1), 1-22.
  23. McDonald, G. C. & Galarneau, D. I. (1975). A Monte Carlo evaluation of ridge-type estimators. Journal of the American Statistical Association, 70(350), 407-416. doi: 10.1080/01621459.1975.10479882.
  24. Muniz, G. & Kibria, B. M. G. (2009). On some ridge regression estimators: An empirical comparison. Communications in Statistics – Simulation and Computation, 38(3), 621-630.
  25. Muniz, G., Kibria, B. M. G., Mansson, K., & Shukur, G. (2012). On developing ridge regression parameters: A graphical investigation. Statistics and Operations Research Transactions, 36(2), 115-138.
  26. Nomura, M. (1988). On the almost unbiased ridge regression estimation. Communication in Statistics – Simulation and Computation, 17(3), 729-743. doi:10.1080/03610918808812690
  27. Oladapo, O. J., Owolabi, A. T., Idowu, J. I., & Ayinde, K. (2022). A new modified Liu ridge-type estimator for the linear regression model: Simulation and application. International Journal of Clinical Biostatistics and Biometrics, 8(2).
  28. Oladapo, O. J., Idowu, J. I., Owolabi, A. T., & Ayinde, K. (2023). A new biased two-parameter estimator in linear regression model. EQUATIONS, 3, 73–92. https://doi.org/10.37394/232021.2023.3.10
  29. Oladapo, O. J., Alabi, O. O., & Ayinde, K. (2024). Another new two-parameter estimator in dealing with multicollinearity in the logistic regression model. International Journal of Mathematical Sciences and Optimization: Theory and Applications, 10(2), 22–35. https://doi.org/10.5281/zenodo.10937145
  30. Owolabi, A. T., Ayinde, K., Idowu, J. I., Oladapo, O. J., & Lukman, D. F. (2022). A New Two-Parameter Estimator in the Linear Regression Model with Correlated Regressors. Journal of Statistics Applications & Probability, 11(2), 499-512. http://dx.doi.org/10.18576/jsap/110211.

Raheed Saheed Lekan
Corresponding author

Department of Statistics, School of Science and Technology, Federal Polytechnic, Ayede, Oyo State, Nigeria

Owolabi Muhammed Ishola
Co-author

Department of Statistics, School of Science and Technology, Federal Polytechnic, Ayede, Oyo State, Nigeria

James Olasunkanmi Oladapo
Co-author

Department of Statistics, Faculty of Science, Ladoke Akintola University of Technology, Ogbomoso, Oyo State, Nigeria

Olabode John Oluwasina
Co-author

Department of General Studies, School of Management, Federal Polytechnic, Ayede, Oyo State, Nigeria

Fawolu Oluseyi Ajayi
Co-author

Department of General Studies, School of Management, Federal Polytechnic, Ayede, Oyo State, Nigeria

Teliat Rasheed Olusanjo
Co-author

Department of Science Laboratory Technology, School of Science and Technology, Federal Polytechnic, Ayede, Oyo State, Nigeria

Raheed Saheed Lekan*, Owolabi Muhammed Ishola, James Olasunkanmi Oladapo, Olabode John Oluwasina, Fawolu Oluseyi Ajayi, Teliat Rasheed Olusanjo, Some Ridge Biasing Parameter for Linear Regression Model and Their Performances on Kibria-Lukman Estimator, Int. J. Sci. R. Tech., 2025, 2 (12), 14-23. https://doi.org/10.5281/zenodo.18118913
