1Department of Statistics, School of Science and Technology, Federal Polytechnic, Ayede, Oyo State, Nigeria
2Department of Statistics, Faculty of Science, Ladoke Akintola University of Technology, Ogbomoso, Oyo State, Nigeria
3Department of General Studies, School of Management, Federal Polytechnic Ayede, Oyo State Nigeria
4Department of Science Laboratory Technology, School of Science and Technology, Federal Polytechnic, Ayede, Oyo State, Nigeria
Multicollinearity, arising from the violation of the independence assumption among explanatory variables in a linear regression model, poses a significant challenge to parameter estimation. It inflates the variances of the Ordinary Least Squares (OLS) estimates, leading to unstable coefficient estimates and unreliable inference. To mitigate this problem, several biased estimators such as the Ridge and Liu estimators have been developed. Recently, Kibria and Lukman (2020) introduced the Kibria–Lukman Estimator (KLE), a ridge-type alternative designed to improve estimation accuracy under multicollinearity. However, the efficiency of ridge-type estimators critically depends on the choice of the biasing parameter, which controls the trade-off between bias and variance. This study conducts a comprehensive evaluation of 25 existing ridge biasing parameters alongside three newly proposed parameters within the KLE framework. The proposed estimators were assessed using extensive Monte Carlo simulations under varying levels of multicollinearity and sample sizes. Performance was evaluated based on the Mean Squared Error (MSE) criterion. The results reveal that the proposed estimator, Ridge_kgk, consistently outperforms the competing estimators, demonstrating superior efficiency and stability across different data conditions. The findings highlight the potential of the new biasing parameters in enhancing the robustness and predictive accuracy of ridge-type estimators in multicollinear regression settings.
Multiple Linear Regression (MLR) extends the simple linear regression framework by incorporating two or more explanatory variables into a single predictive model for a continuous response variable. The general form of the model is expressed as follows:
\[
y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_p x_{ip} + e_i \tag{1}
\]
for \(i = 1, 2, \ldots, n\), where \(\beta_0, \beta_1, \ldots, \beta_p\) are the regression coefficients, \(x_{i1}, \ldots, x_{ip}\) are the independent variables, \(y_i\) is the dependent variable and \(e_i\) is the stochastic error term. In matrix form, the \(n\) equations can be written as:
\[
y = X\beta + e
\]
Where y denotes an n × 1 vector of observed responses, β represents a p × 1 vector of unknown regression coefficients, X is an n × p matrix of observed explanatory variables and e is an n × 1 vector of random error terms assumed to follow a multivariate normal distribution with mean vector 0 and covariance matrix σ2In, where In is an identity matrix of order n. The Ordinary Least Squares (OLS) estimator of β is therefore expressed as:
\[
\hat{\beta} = (X'X)^{-1}X'y
\]
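As a minimal numerical sketch of the estimator above (the simulated data, variable names, and dimensions here are purely illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))               # n x p design matrix
beta_true = np.array([1.0, 2.0, -1.5])    # illustrative coefficients
y = X @ beta_true + rng.normal(scale=0.5, size=n)

XtX = X.T @ X
beta_hat = np.linalg.solve(XtX, X.T @ y)  # (X'X)^{-1} X'y
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)      # unbiased estimate of sigma^2
cov_hat = sigma2_hat * np.linalg.inv(XtX) # Cov(beta_hat) = sigma^2 (X'X)^{-1}
```

Solving the normal equations with `np.linalg.solve` avoids explicitly inverting X'X for the coefficient estimate, which is numerically preferable when X'X is ill-conditioned.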
The covariance matrix of \(\hat{\beta}\) is estimated as \(\mathrm{Cov}(\hat{\beta}) = \sigma^2(X'X)^{-1}\). It is evident that both the estimator \(\hat{\beta}\) and its covariance structure are highly dependent on the properties of the matrix X'X.

1.1 Ridge Regression

The Ridge regression (RR), originally introduced in 1970 by Hoerl and Kennard, was developed to address the issue of multicollinearity commonly encountered in engineering and other empirical data analyses. Their pioneering study revealed that the introduction of a positive ridge parameter k leads to a ridge regression estimator whose Mean Squared Error (MSE) is lower than the variance of the Ordinary Least Squares (OLS) estimator, thereby achieving greater estimation efficiency through an optimal bias–variance trade-off. Consequently, the ridge regression estimator (RRE) is defined as follows:
\[
\hat{\beta}_{RR} = (X'X + kI_p)^{-1}X'y = M\hat{\beta}, \quad k > 0
\]
Where \(M = [I_p + kZ^{-1}]^{-1}\) and \(Z = X'X\).
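The direct form and the shrinkage (M) form of the ridge estimator are algebraically identical, which can be checked numerically (a sketch on simulated data; names and values are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 40, 3, 0.5
X = rng.normal(size=(n, p))
y = X @ np.ones(p) + rng.normal(size=n)

Z = X.T @ X                                        # Z = X'X
beta_ols = np.linalg.solve(Z, X.T @ y)

# Direct form: (X'X + k I_p)^{-1} X'y
beta_ridge = np.linalg.solve(Z + k * np.eye(p), X.T @ y)

# Equivalent shrinkage form: M @ beta_ols with M = [I_p + k Z^{-1}]^{-1}
M = np.linalg.inv(np.eye(p) + k * np.linalg.inv(Z))
print(np.allclose(beta_ridge, M @ beta_ols))       # True: both forms agree
```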
1.2 The Kibria–Lukman Estimator
The newly formulated one-parameter estimator is obtained by optimizing the following objective function, designed to balance bias and variance in the estimation process:
\[
\min_{\beta}\; (y - X\beta)'(y - X\beta) + k(\beta + \hat{\beta})'(\beta + \hat{\beta})
\]
Minimization of the objective function with respect to β leads to the corresponding normal equations:
\[
(X'X + kI_p)\beta = (X'X - kI_p)\hat{\beta}
\]
In this formulation, k represents a nonnegative constant. Solving the preceding equation for β produces the explicit form of the proposed estimator as:
\[
\hat{\beta}_{KL} = (X'X + kI_p)^{-1}(X'X - kI_p)\hat{\beta}
\]
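The closed form of the Kibria–Lukman estimator takes only a few lines of code (a sketch on simulated data; the variable names and the value of k are mine, chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, k = 40, 3, 0.3
X = rng.normal(size=(n, p))
y = X @ np.ones(p) + rng.normal(size=n)

Z = X.T @ X
I = np.eye(p)
beta_ols = np.linalg.solve(Z, X.T @ y)

# Kibria-Lukman estimator: (X'X + kI)^{-1} (X'X - kI) beta_ols
beta_kl = np.linalg.solve(Z + k * I, (Z - k * I) @ beta_ols)

# Setting k = 0 recovers the OLS estimator
beta_kl0 = np.linalg.solve(Z, Z @ beta_ols)
```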
2.1 Canonical Form
The canonical form of the model is:
\[
y = A\alpha + e
\]
Where A = XP and α = P'β. Here, P is an orthogonal matrix such that \(P'X'XP = \Lambda = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_p)\), where \(\lambda_i\) is the \(i\)-th eigenvalue of X'X.
The ridge estimator (RE) of α is:
\[
\hat{\alpha}_k = (\Lambda + kI_p)^{-1}A'y
\]
Where \(\hat{\alpha} = \Lambda^{-1}A'y\) denotes the OLS estimator of α.
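The canonical transformation can be verified numerically through the eigendecomposition of X'X; in this form the ridge estimator reduces to component-wise scaling (an illustrative sketch with simulated data):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, k = 40, 3, 0.5
X = rng.normal(size=(n, p))
y = X @ np.ones(p) + rng.normal(size=n)

lam, P = np.linalg.eigh(X.T @ X)   # eigenvalues lambda_i, orthogonal P
A = X @ P                          # canonical regressors: A'A = diag(lam)

# Ridge estimator of alpha in canonical form: (Lambda + kI)^{-1} A'y
alpha_k = (A.T @ y) / (lam + k)

# Rotating back reproduces the ridge estimator of beta
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
print(np.allclose(P @ alpha_k, beta_ridge))  # True
```

Because A'A is diagonal, the matrix inversion collapses to an element-wise division, which is why the canonical form is convenient for deriving MSE expressions.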
Thus, the MSE of the above estimator can be written as:
\[
MSE(\hat{\alpha}_k) = \sigma^2 \sum_{i=1}^{p} \frac{\lambda_i}{(\lambda_i + k)^2} + k^2 \sum_{i=1}^{p} \frac{\alpha_i^2}{(\lambda_i + k)^2}
\]
Finally, the MSE of the Kibria–Lukman estimator, after using the above stated definitions, can be written as:
\[
MSE(\hat{\alpha}_{KL}) = \sigma^2 \sum_{i=1}^{p} \frac{(\lambda_i - k)^2}{\lambda_i(\lambda_i + k)^2} + 4k^2 \sum_{i=1}^{p} \frac{\alpha_i^2}{(\lambda_i + k)^2}
\]
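The ridge and Kibria–Lukman MSE expressions in canonical form can be coded as scalar functions of k. In this sketch the quantities sigma2, lam and alpha are treated as known, purely for illustration; note that both expressions reduce to the OLS MSE, \(\sigma^2 \sum_i 1/\lambda_i\), at k = 0:

```python
import numpy as np

def mse_ridge(k, lam, alpha, sigma2):
    """Canonical-form MSE of the ridge estimator."""
    return (sigma2 * np.sum(lam / (lam + k) ** 2)
            + k ** 2 * np.sum(alpha ** 2 / (lam + k) ** 2))

def mse_kl(k, lam, alpha, sigma2):
    """Canonical-form MSE of the Kibria-Lukman estimator."""
    return (sigma2 * np.sum((lam - k) ** 2 / (lam * (lam + k) ** 2))
            + 4 * k ** 2 * np.sum(alpha ** 2 / (lam + k) ** 2))

# Illustrative ill-conditioned spectrum and (assumed known) parameters
lam = np.array([5.0, 1.0, 0.05])
alpha = np.array([1.0, 0.5, 0.2])
sigma2 = 1.0

mse_ols = sigma2 * np.sum(1 / lam)  # OLS MSE: sigma^2 * sum(1/lambda_i)
print(np.isclose(mse_ridge(0, lam, alpha, sigma2), mse_ols))  # True
print(np.isclose(mse_kl(0, lam, alpha, sigma2), mse_ols))     # True
```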
Differentiating \(P_k\) (the MSE function above) with respect to k and setting the derivative to zero gives an optimal value of the biasing parameter for each component:
\[
k_i = \frac{\sigma^2}{2\alpha_i^2 + \sigma^2/\lambda_i}
\]
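In practice the component-wise optimal values are estimated by plugging in the OLS quantities. The sketch below does this on simulated data; aggregating the component-wise values by taking their minimum is one common choice in the ridge literature, adopted here as an assumption rather than the paper's prescription:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 60, 3
X = rng.normal(size=(n, p))
y = X @ np.ones(p) + rng.normal(size=n)

lam, P = np.linalg.eigh(X.T @ X)
A = X @ P
alpha_hat = (A.T @ y) / lam           # OLS estimate of alpha (A'A diagonal)
resid = y - A @ alpha_hat
sigma2_hat = resid @ resid / (n - p)  # unbiased estimate of sigma^2

# Plug-in component-wise values: sigma^2 / (2 alpha_i^2 + sigma^2 / lambda_i);
# taking the minimum over i is an assumed aggregation rule for illustration.
k_i = sigma2_hat / (2 * alpha_hat ** 2 + sigma2_hat / lam)
k_hat = k_i.min()
print(k_hat > 0)  # True: the plug-in estimate is strictly positive
```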
Raheed Saheed Lekan*, Owolabi Muhammed Ishola, James Olasunkanmi Oladapo, Olabode John Oluwasina, Fawolu Oluseyi Ajayi, Teliat Rasheed Olusanjo, Some Ridge Biasing Parameter for Linear Regression Model and Their Performances on Kibria-Lukman Estimator, Int. J. Sci. R. Tech., 2025, 2 (12), 14-23. https://doi.org/10.5281/zenodo.18118913