
Abstract

Multicollinearity, arising from the violation of the independence assumption among explanatory variables in a linear regression model, poses a significant challenge to parameter estimation. It inflates the variances of the Ordinary Least Squares (OLS) estimates, leading to unstable coefficient estimates and unreliable inference. To mitigate this problem, several biased estimators, such as the Ridge and Liu estimators, have been developed. Recently, Kibria and Lukman (2020) introduced the Kibria–Lukman Estimator (KLE), a ridge-type alternative designed to improve estimation accuracy under multicollinearity. However, the efficiency of ridge-type estimators critically depends on the choice of the biasing parameter, which controls the trade-off between bias and variance. This study conducts a comprehensive evaluation of 25 existing ridge biasing parameters alongside three newly proposed parameters within the KLE framework. The estimators were assessed using extensive Monte Carlo simulations under varying levels of multicollinearity and sample sizes, with performance evaluated on the Mean Squared Error (MSE) criterion. The results reveal that the proposed estimator, Ridge_kgk, consistently outperforms the competing estimators, demonstrating superior efficiency and stability across the data conditions considered. The findings highlight the potential of the new biasing parameters to enhance the robustness and predictive accuracy of ridge-type estimators in regression settings affected by multicollinearity.

Keywords

Regression, Multicollinearity, Kibria–Lukman estimator, Simulation study, Mean Squared Error

Introduction

Multiple Linear Regression (MLR) extends the simple linear regression framework by incorporating two or more explanatory variables into a single predictive model for a continuous response variable. The general form of the model is expressed as follows:

$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_p x_{ip} + e_i \qquad (1)$$

for i = 1, 2, …, n, where β0, β1, …, βp are regression coefficients, xi1, …, xip are the independent variables, yi is the dependent variable, and ei is the stochastic error term. In matrix form, the n equations can be written as:

$$y = X\beta + e \qquad (2)$$

Where y denotes an n × 1 vector of observed responses, β represents a p × 1 vector of unknown regression coefficients, X is an n × p matrix of observed explanatory variables, and e is an n × 1 vector of random error terms assumed to follow a multivariate normal distribution with mean vector 0 and covariance matrix σ²In, where In is an identity matrix of order n. The Ordinary Least Squares (OLS) estimator of β is therefore expressed as:

$$\hat{\beta} = (X'X)^{-1}X'y \qquad (3)$$

The covariance matrix of $\hat{\beta}$ is estimated as $\operatorname{Cov}(\hat{\beta}) = \sigma^2(X'X)^{-1}$. It is evident that both the estimator $\hat{\beta}$ and its covariance structure are highly dependent on the properties of the matrix X'X.
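To make this dependence on X'X concrete, the following R sketch (illustrative only, not the authors' code; variable names are our own) computes the OLS estimator via the normal equations on a simulated near-collinear design of the kind described in Section 4:

```r
# Illustrative sketch: OLS via the normal equations on a near-collinear design.
set.seed(1)
n <- 50; p <- 5
gamma <- 0.95                                   # strength of the shared component
z <- matrix(rnorm(n * (p + 1)), n, p + 1)
X <- sqrt(1 - gamma^2) * z[, 1:p] + gamma * z[, p + 1]  # correlated regressors
beta <- rep(1 / sqrt(p), p)                     # chosen so that beta'beta = 1
y <- X %*% beta + rnorm(n, sd = 3)

XtX <- crossprod(X)                             # X'X
beta_ols <- solve(XtX, crossprod(X, y))         # (X'X)^{-1} X'y
kappa(XtX)                                      # large condition number signals multicollinearity
```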

1.1 Ridge Regression

Ridge regression (RR), originally introduced by Hoerl and Kennard (1970), was developed to address the issue of multicollinearity commonly encountered in engineering and other empirical data analyses. Their pioneering study revealed that the introduction of a positive ridge parameter k leads to a ridge regression estimator whose Mean Squared Error (MSE) is lower than the variance of the Ordinary Least Squares (OLS) estimator, thereby achieving greater estimation efficiency through an optimal bias–variance trade-off. Consequently, the ridge regression estimator (RRE) is defined as follows:

$$\hat{\beta}(k) = (X'X + kI_p)^{-1}X'y = M\hat{\beta} \qquad (4)$$

where $M = [I_p + kZ^{-1}]^{-1}$, k ≥ 0, Z = X'X, and Ip denotes an identity matrix of order p. Since the matrix [Z + kIp] remains invertible for all k > 0, a unique solution for $\hat{\beta}(k)$ exists. The ridge estimator is inherently biased; however, for a positive ridge parameter k, it often achieves a smaller MSE than the OLS estimator. From Eq. (4), it follows that as k → 0, $\hat{\beta}(k)$ approaches the OLS estimator $\hat{\beta}$, and as k → ∞, $\hat{\beta}(k)$ → 0. The parameter k, often referred to as the ridge or biasing parameter, must be estimated from empirical data. In recent decades, considerable research effort in the domains of multicollinearity and ridge regression estimation has focused on determining appropriate and efficient methods for estimating k. Numerous scholars have contributed to this line of inquiry, proposing various modified forms of ridge-type estimators. Notably, Hoerl and Kennard (1970) introduced the original ridge regression estimator, which was subsequently extended through the development of the Modified Ridge Regression (MRR) estimator, the Liu estimator (Liu, 1993), and more recently, the Kibria–Lukman estimator (Kibria and Lukman, 2020).
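A minimal sketch of Eq. (4), reusing the simulated X and y from the previous snippet; the grid of k values is arbitrary and purely illustrative:

```r
# Ridge estimator of Eq. (4) for a grid of biasing parameters k.
ridge_beta <- function(X, y, k) {
  solve(crossprod(X) + k * diag(ncol(X)), crossprod(X, y))  # (X'X + k I_p)^{-1} X'y
}

# k = 0 reproduces OLS; increasing k shrinks the coefficients toward zero.
for (k in c(0, 0.1, 1, 10)) print(round(t(ridge_beta(X, y, k)), 4))
```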

1.2 The Kibria–Lukman Estimator

The newly formulated one-parameter estimator is obtained by optimizing the following objective function, designed to balance bias and variance in the estimation process:

$$\min_{\beta}\;(y - X\beta)'(y - X\beta) + k(\beta + \hat{\beta})'(\beta + \hat{\beta}) \qquad (5)$$

Minimization of the objective function with respect to β leads to the corresponding normal equations:

$$(X'X + kI_p)\beta = X'y - k\hat{\beta} \qquad (6)$$

In this formulation, k represents a nonnegative constant. Solving the preceding equation with respect to β produces the explicit form of the proposed estimator as:

$$\hat{\beta}_{KL} = (X'X + kI_p)^{-1}(X'y - k\hat{\beta}) = MF_k\hat{\beta} \qquad (7)$$

where $M = [I_p + kZ^{-1}]^{-1}$ as before and $F_k = I_p - kZ^{-1}$.
The proposed estimator, hereafter referred to as the Kibria–Lukman (KL) estimator, is denoted by $\hat{\beta}_{KL}$ and serves as a ridge-type modification of the conventional OLS estimator, with biasing parameter k > 0 (Kibria and Lukman, 2020). As with any regression estimator, the determination of an appropriate biasing parameter in the recently developed Kibria–Lukman Estimator (KLE) is crucial for assessing its efficiency and overall performance. Over the years, several studies have proposed and examined various estimators for the ridge regression biasing parameter k. Foundational contributions include those of Hoerl and Kennard (1970), Hoerl, Kennard, and Baldwin (1975), McDonald and Galarneau (1975), Lawless and Wang (1976), and Dempster, Schatzoff, and Wermuth (1977). Subsequent advancements were made by Gibbons (1981), Kibria (2003), Khalaf and Shukur (2005), Alkhamisi and Shukur (2008), Muniz and Kibria (2009), Muniz, Kibria, Mansson, and Shukur (2012), and Mansson, Shukur, and Kibria (2010). More recent developments include the works of Hefnawy and Farag (2013), Aslam (2014), Arashi and Valizadeh (2015), Dorugade (2016), Lukman and Ayinde (2017), Owolabi et al. (2022), Kibria (2022), and Adedoyin et al. (2025), among others. Despite these extensive efforts, there has been limited discussion of the interplay between multicollinearity and the error variance (noise parameter), a situation in which a high degree of multicollinearity is often accompanied by inflated error variance. This challenge can substantially affect the performance of existing biasing parameter estimators. Therefore, in this study, we propose new estimators for the biasing parameter k within the Kibria–Lukman Estimator (KLE) framework to effectively address this limitation.
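The KL estimator is a one-line modification of the ridge computation. A sketch (our own illustration, not the authors' implementation), using the algebraically equivalent form $(X'X + kI_p)^{-1}(X'X - kI_p)\hat{\beta}$ from Eq. (7):

```r
# Kibria-Lukman (KL) estimator of Eq. (7).
kl_beta <- function(X, y, k) {
  p   <- ncol(X)
  XtX <- crossprod(X)
  beta_hat <- solve(XtX, crossprod(X, y))                    # OLS
  solve(XtX + k * diag(p), (XtX - k * diag(p)) %*% beta_hat) # (X'X+kI)^{-1}(X'X-kI) beta_hat
}
```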
2. Statistical Methodology

2.1 Canonical Form

The canonical form of the model is:

$$y = A\alpha + e$$

where A = XP and α = P'β. Here, P is an orthogonal matrix such that

$$P'X'XP = \Lambda = \operatorname{diag}(\lambda_1, \lambda_2, \ldots, \lambda_p)$$

The OLS estimator of α is:

$$\hat{\alpha} = \Lambda^{-1}A'y$$

The ridge estimator (RE) of α is:

$$\hat{\alpha}(k) = (\Lambda + kI_p)^{-1}A'y$$

where k ≥ 0 is the biasing parameter. In canonical form, the KL estimator becomes $\hat{\alpha}_{KL}(k) = (\Lambda + kI_p)^{-1}(\Lambda - kI_p)\hat{\alpha}$. Thus the MSE of the proposed estimator can be written as the sum of its total variance and squared bias. Finally, the MSE of the Kibria–Lukman estimator, after using the above stated definitions, can be written as:

$$MSE\big(\hat{\alpha}_{KL}(k)\big) = \sigma^2\sum_{i=1}^{p}\frac{(\lambda_i - k)^2}{\lambda_i(\lambda_i + k)^2} + 4k^2\sum_{i=1}^{p}\frac{\alpha_i^2}{(\lambda_i + k)^2}$$

Differentiating this MSE with respect to k and setting $\partial MSE(k)/\partial k = 0$, we obtain

$$k_i = \frac{\sigma^2}{2\alpha_i^2 + \sigma^2/\lambda_i} \qquad (16)$$

The optimal value of k in (16) depends on the unknown parameters σ² and αi²; these are replaced with their unbiased estimates.
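The canonical quantities and the coordinate-wise optimal k of Eq. (16) can be computed from the spectral decomposition of X'X. A sketch, continuing with X, y, and beta_ols from the earlier snippets:

```r
# Canonical form A = XP, alpha = P'beta, and the optimal k of Eq. (16).
eig   <- eigen(crossprod(X))
P     <- eig$vectors                      # orthogonal: P' X'X P = diag(lambda)
lam   <- eig$values
A     <- X %*% P                          # canonical regressors
alpha <- crossprod(P, beta_ols)           # alpha-hat = P' beta-hat
s2    <- sum(residuals(lm(y ~ X - 1))^2) / (n - p)  # unbiased estimate of sigma^2
k_opt <- s2 / (2 * alpha^2 + s2 / lam)    # Eq. (16): one candidate k per coordinate
```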

2.2 Biasing Parameters

In accordance with the methodology introduced by Hoerl, Kennard, and Baldwin (1975), the harmonic-mean formulation corresponding to Equation (16) is defined as:

$$\hat{k}_{HM} = \frac{p\hat{\sigma}^2}{\sum_{i=1}^{p}\left(2\hat{\alpha}_i^2 + \hat{\sigma}^2/\lambda_i\right)}$$

As proposed by Özkale and Kaçıranlar (2007), the minimum form of Equation (16) is expressed as:

$$\hat{k}_{min} = \min_{i}\left(\frac{\hat{\sigma}^2}{2\hat{\alpha}_i^2 + \hat{\sigma}^2/\lambda_i}\right)$$

where the λi are eigenvalues of the matrix X'X, $\hat{\alpha}_i$ is the ith element of $\hat{\alpha}$, and $\hat{\sigma}^2 = (y - X\hat{\beta})'(y - X\hat{\beta})/(n - p)$ is the unbiased estimate of σ².

We next examine the existing methods proposed in the literature for determining the value of k. Hoerl and Kennard (1970) recommended estimating k (here denoted by kHK) as:

$$k_{HK} = \frac{\hat{\sigma}^2}{\hat{\alpha}_{max}^2}$$

Here, $\hat{\alpha}_{max}^2$ represents the largest element of $\hat{\alpha}_i^2$. Hoerl and Kennard (1970) demonstrated that the estimator kHK yields a smaller Mean Squared Error (MSE) than the OLS estimator.

Hoerl et al. (1975) defined the ridge parameter k (denoted here by kHKB) as:

$$k_{HKB} = \frac{p\hat{\sigma}^2}{\sum_{i=1}^{p}\hat{\alpha}_i^2}$$

Lawless and Wang (1976) derived the biasing parameter k (denoted here by kLW) as:

$$k_{LW} = \frac{p\hat{\sigma}^2}{\sum_{i=1}^{p}\lambda_i\hat{\alpha}_i^2}$$

Hocking, Speed, and Lynn (1976) also derived a shrinkage parameter k (denoted here by kHSL).

Kibria (2003) suggested alternative biasing estimators for k derived from the geometric mean (GM) and the median of $\hat{\sigma}^2/\hat{\alpha}_i^2$. These estimators are expressed as follows:

$$k_{GM} = \frac{\hat{\sigma}^2}{\left(\prod_{i=1}^{p}\hat{\alpha}_i^2\right)^{1/p}}, \qquad k_{MED} = \operatorname{median}_i\left\{\frac{\hat{\sigma}^2}{\hat{\alpha}_i^2}\right\}$$

Based on a modification of kHK, Khalaf and Shukur (2005) suggested k (denoted by kKS) to be:

$$k_{KS} = \frac{\lambda_{max}\hat{\sigma}^2}{(n - p)\hat{\sigma}^2 + \lambda_{max}\hat{\alpha}_{max}^2}$$

where λmax is the maximum eigenvalue of the matrix X'X. Building on the work of Kibria (2003) and Khalaf and Shukur (2005), Alkhamisi, Khalaf, and Shukur (2006) proposed three further estimators for k, among them the arithmetic-mean version karith.

 

Drawing upon the geometric-mean and square-root methods proposed by Khalaf and Shukur (2005), Kibria (2003), and Alkhamisi et al. (2006), Muniz and Kibria (2009) developed seven new estimators for the ridge parameter k.

Following the square-root transformation methodology of Alkhamisi and Shukur (2008), Muniz et al. (2012) introduced five new estimators for the ridge parameter k.
Khalaf (2012), based on a modification of kHK, proposed k (denoted by kGKK) in terms of λmax and λmin, the maximum and minimum eigenvalues of the matrix X'X, respectively.

Nomura (1988) proposed an estimator of the ridge parameter k (denoted by kHMO).
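For reference, several of the classical biasing parameters above can be evaluated in their textbook forms from the canonical quantities lam, alpha, and s2 computed in the earlier snippet (our transcription, for illustration only):

```r
# Classical biasing parameters computed from lam, alpha, s2, n, p (defined above).
k_HK  <- s2 / max(alpha^2)                        # Hoerl & Kennard (1970)
k_HKB <- p * s2 / sum(alpha^2)                    # Hoerl, Kennard & Baldwin (1975)
k_LW  <- p * s2 / sum(lam * alpha^2)              # Lawless & Wang (1976)
k_GM  <- s2 / prod(alpha^2)^(1 / p)               # Kibria (2003), geometric mean
k_MED <- median(s2 / alpha^2)                     # Kibria (2003), median
k_KS  <- max(lam) * s2 /
         ((n - p) * s2 + max(lam) * max(alpha^2)) # Khalaf & Shukur (2005)
```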

3. The Proposed Biasing Estimators

Following the modification of Khalaf (2012), a new biasing parameter (reported as Ridge_kgk in the results below) is proposed. Following the work of Özkale and Kaçıranlar (2007), the maximum version and the median version of Equation (16) are also proposed and defined as:

$$k_{max} = \max_{i}\left(\frac{\hat{\sigma}^2}{2\hat{\alpha}_i^2 + \hat{\sigma}^2/\lambda_i}\right), \qquad k_{med} = \operatorname{median}_i\left\{\frac{\hat{\sigma}^2}{2\hat{\alpha}_i^2 + \hat{\sigma}^2/\lambda_i}\right\}$$

where λmax and λmin are the maximum and minimum eigenvalues of X'X, respectively.

4. Simulation Study

The primary objective of this study is to evaluate and compare the performance of various ridge biasing parameter estimators, with the aim of recommending efficient and reliable options for practical applications. Since a purely theoretical comparison among these estimators is not feasible, a simulation study was conducted using R software. The design of the simulation experiment was structured around factors that are expected to influence the statistical properties of the estimators under consideration, as well as the evaluation criteria employed to assess their performance. Given that the degree of multicollinearity among the explanatory variables plays a critical role in the behavior of ridge-type estimators, we adopted the data generation approach of Kibria and Lukman (2020), Oladapo et al. (2022, 2023, 2024), Idowu et al. (2022, 2023), and Owolabi et al. (2022), where the explanatory variables were simulated using the following relationship:

$$x_{ij} = (1 - \gamma^2)^{1/2} z_{ij} + \gamma z_{i(p+1)}, \qquad i = 1, 2, \ldots, n; \; j = 1, 2, \ldots, p$$

where the zij are independent standard normal pseudo-random numbers and γ determines the correlation between any two explanatory variables, with values γ = 0.80, 0.90, 0.95, 0.99 for p = 5. These variables are standardized so that X'X and X'y are in correlation form. The n observations of y are generated according to the following equation:

$$y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_p x_{ip} + e_i$$

where the ei ~ NIID(0, σ²) and β'β = 1, as in Lukman et al. (2021). The simulation was conducted with 5,000 replications, considering sample sizes of n = 50 and n = 100 and error standard deviations of σ = 3.0, 5.0, and 10.0. The average values of k for the Kibria–Lukman estimators were recorded, together with the proportion of replications in which the KL estimators yield a smaller Mean Squared Error (MSE) than the OLS estimator. For comparison purposes, the estimated MSE is computed as follows:

$$\widehat{MSE}(\hat{\beta}) = \frac{1}{5000}\sum_{j=1}^{5000}(\hat{\beta}_j - \beta)'(\hat{\beta}_j - \beta)$$
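A single cell of this Monte Carlo design can be sketched as follows (illustrative only; the study sweeps all 28 biasing parameters, whereas this sketch uses kHKB as a stand-in and reuses kl_beta from Section 1.2):

```r
# One simulation cell: estimated MSE of OLS versus the KL estimator.
set.seed(2025)                                  # arbitrary seed
n <- 50; p <- 5; gamma <- 0.9; sigma <- 3; reps <- 5000
beta <- rep(1 / sqrt(p), p)                     # one choice satisfying beta'beta = 1
mse_ols <- mse_kl <- 0
for (r in 1:reps) {
  z <- matrix(rnorm(n * (p + 1)), n, p + 1)
  X <- sqrt(1 - gamma^2) * z[, 1:p] + gamma * z[, p + 1]
  y <- X %*% beta + rnorm(n, sd = sigma)
  b_ols <- solve(crossprod(X), crossprod(X, y))
  s2    <- sum((y - X %*% b_ols)^2) / (n - p)   # unbiased sigma^2 estimate
  k     <- p * s2 / sum(b_ols^2)                # k_HKB, an illustrative choice
  b_kl  <- kl_beta(X, y, k)
  mse_ols <- mse_ols + sum((b_ols - beta)^2) / reps
  mse_kl  <- mse_kl  + sum((b_kl  - beta)^2) / reps
}
c(OLS = mse_ols, KL = mse_kl)
```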

RESULTS

This section reports the findings of the Monte Carlo simulation, focusing on the comparative performance of the different biasing parameters against the Ordinary Least Squares (OLS) estimator in terms of Mean Squared Error (MSE). The key outcomes are illustrated through graphical analyses, providing a visual summary of estimator efficiency. Furthermore, detailed quantitative results for the five most efficient biasing parameters are presented in Tables 1 and 2 for n = 50 and n = 100, respectively.

Table 1: MSE Comparison with Various Biasing Parameters When n = 50

σ = 3     γ = 0.8                γ = 0.9                γ = 0.95               γ = 0.99
          OLS        3.119261    OLS        6.396138    OLS        13.03209    OLS        72.35728
          ridgekgk   0.506875    ridgekgk   0.841694    ridgekgk   1.633292    ridgekgk   9.289061
          ridgekm9   0.900381    ridgekm9   1.334978    ridgekm5   3.296332    ridgeks    20.44684
          kls_max    1.019445    ridgekm5   1.951883    ridgekm7   3.346671    ridgekhmo  21.02746
          ridgekm5   1.233454    ridgekhmo  1.974866    ridgekm9   3.494403    ridgekm5   23.91627
          ridgekhmo  1.245669    ridgekm7   2.048751    ridgeksmx  3.577529    ridgekm7   25.02527

σ = 5     OLS        8.744206    OLS        17.64085    OLS        36.5453     OLS        211.1493
          ridgekgk   1.065551    ridgekgk   2.04342     ridgekgk   4.304672    ridgekgk   25.99921
          ridgekm9   2.30903     ridgekm9   2.988793    ridgeksmx  8.997492    ridgeks    57.26265
          ridgekhmo  2.607262    ridgeksmx  3.992631    ridgekm5   9.006038    ridgekhmo  62.22484
          kls_max    2.943057    ridgekm5   4.783466    ridgekm9   9.006122    ridgekm4   71.38203
          ridgekm3   3.146505    ridgekhmo  4.921552    ridgekm7   9.493686    kls_med    71.83467

σ = 10    OLS        34.99279    OLS        71.52686    OLS        147.5813    OLS        802.4952
          ridgekgk   3.775002    ridgekgk   7.9108      ridgekgk   17.21413    ridgekgk   101.2598
          ridgekm9   7.942291    ridgekm9   10.54621    ridgeksmx  35.78133    ridgeks    226.0707
          ridgekhmo  9.244573    ridgeksmx  12.16536    ridgekm9   36.01294    ridgekhmo  234.7825
          ridgeksmx  9.46138     ridgekm5   18.64363    ridgekm5   36.87991    ridgekm4   254.3872
          ridgekm5   11.12184    ridgekhmo  18.89106    ridgeks    38.13841    ridgekm11  280.2993
Table 2: MSE Comparison with Various Biasing Parameters When n = 100

σ = 3     γ = 0.8                γ = 0.9                γ = 0.95               γ = 0.99
          OLS        1.456806    OLS        2.979783    OLS        6.136638    OLS        34.60792
          ridgekgk   0.333625    ridgekgk   0.461724    ridgekgk   0.819828    ridgekgk   4.445529
          kls_max    0.562144    ridgekm9   0.84869     ridgekm9   1.185489    ridgekm6   8.503839
          ridgekm9   0.725821    kls_max    0.943758    ridgekm5   1.797427    ridgekm4   8.539522
          ridgekm3   0.757978    ridgekhmo  1.10419     kls_max    1.837256    ridgekm5   9.234336
          ridgekm5   0.76469     ridgekm5   1.169561    ridgekm7   1.891241    ridgekm7   9.790814

σ = 5     OLS        4.005733    OLS        8.42795     OLS        17.53722    OLS        93.26741
          ridgekgk   0.595486    ridgekgk   1.029015    ridgekgk   2.108074    ridgekgk   12.02214
          kls_max    1.241737    ridgekm9   2.168397    ridgekm9   3.055222    ridgeks    21.83337
          ridgekhmo  1.451163    ridgekhmo  2.424392    ridgeksmx  4.064918    ridgekm4   23.5622
          ridgekm3   1.652956    kls_max    2.653076    ridgekm5   4.806325    ridgekm6   24.11049
          kls_med    1.828434    ridgekm3   2.890744    ridgekm7   4.959914    ridgekm5   26.70134

σ = 10    OLS        16.13528    OLS        32.77341    OLS        68.87816    OLS        374.3429
          ridgekgk   1.759601    ridgekgk   3.63438     ridgekgk   8.077204    ridgekgk   47.86255
          ridgekhmo  4.277563    ridgekm9   7.48795     ridgekm9   10.74171    ridgeks    69.34554
          ridgekm3   5.367357    ridgekhmo  8.538107    ridgeksmx  12.05585    ridgekm4   98.14063
          kls_max    5.783906    ridgeksmx  8.886951    ridgekhmo  18.04907    ridgekm6   99.82201
          kls_med    6.304446    ridgekm5   10.49928    ridgekm5   18.51706    ridgekhmo  106.8815
5.1 Performance with Respect to Sigma (σ)

Tables 1 and 2 present the Mean Squared Error (MSE) of the selected ridge biasing parameters as a function of σ, for sample sizes n = 50, 100 and correlation levels γ = 0.8, 0.9, 0.95, 0.99. The findings reveal that the MSE generally increases with higher values of σ. Notably, all ridge-type estimators exhibit smaller MSEs than the Ordinary Least Squares (OLS) estimator, indicating improved estimation efficiency. Specifically, when σ = 3, the Ridge_kgk estimator demonstrates superior performance relative to the other biasing parameters in terms of lower MSE. A similar pattern is observed at σ = 5 and σ = 10, where Ridge_kgk consistently outperforms its counterparts. For clarity, Figure 1 illustrates the behavior of the estimators as a function of σ for γ = 0.8 and n = 50.

Figure 1: Performance as a function of sigma (σ) when n = 50 and γ = 0.8

5.2 Performance with Respect to the Correlation Coefficient (γ)

The Mean Squared Errors (MSEs) of the selected estimators were further examined as a function of the correlation coefficient (γ) for given values of n, σ, and p. To enhance interpretability, the performance of the biasing parameters as a function of γ is depicted in Figure 2. The findings reveal that an increase in the correlation among explanatory variables leads to a corresponding rise in the MSE of ridge-type estimators. Nevertheless, all ridge estimators maintain smaller MSEs compared to the Ordinary Least Squares (OLS) estimator, confirming their efficiency in handling multicollinearity. For relatively low correlation levels (e.g., γ=0.8), the Ridge_kgk estimator exhibits superior performance with the smallest MSE. A similar dominance of Ridge_kgk is observed for γ=0.9 and γ=0.95, based on the minimum MSE criterion. Furthermore, even at a high correlation level (γ=0.99), Ridge_kgk continues to outperform other biasing parameters, demonstrating its robustness under severe multicollinearity.

Figure 2: Performance as a function of correlation coefficient (γ) when n=50 and σ =3

5.3 Performance with Respect to Sample Size (n)

The Mean Squared Errors (MSEs) of the selected ridge biasing parameters were assessed as a function of the sample size (n) for fixed values of γ = 0.8, 0.9, 0.95, and 0.99, with p = 5 and σ = 3, 5, and 10. The results indicate a clear inverse relationship between sample size and MSE, demonstrating that estimator efficiency improves as n increases. Across all simulation designs, the ridge-type estimators consistently outperformed the Ordinary Least Squares (OLS) estimator, achieving notably smaller MSEs. For the smaller sample size (n = 50), the Ridge_kgk estimator exhibited superior performance relative to the other biasing parameters. Similarly, for the larger sample size (n = 100), Ridge_kgk maintained its dominance, yielding the minimum MSE among the compared estimators.

Figure 3: Performance as a function of sample size (n) when γ = 0.8 and σ = 3

CONCLUSION

Based on the outcomes of the simulation experiment, several key conclusions can be drawn. First, an increase in the error standard deviation (σ) leads to a corresponding rise in the Mean Squared Error (MSE). Similarly, higher levels of multicollinearity, represented by larger values of the correlation coefficient (γ), also result in higher MSEs. Conversely, as the sample size (n) increases, the MSEs tend to decrease, even under conditions of strong multicollinearity and large error variance. Across all experimental configurations, the ridge-type estimators consistently outperformed the Ordinary Least Squares (OLS) estimator, achieving significantly smaller MSEs. Among the estimators considered, the proposed Ridge_kgk and the Ridge_km9 estimator of Muniz et al. (2012) exhibited superior performance, producing the lowest MSEs across most scenarios. Therefore, these estimators are recommended for empirical applications where multicollinearity is a concern. The findings further reinforce the theoretical advantage of incorporating optimally selected biasing parameters to enhance estimator stability and predictive accuracy in linear regression models.

ACKNOWLEDGEMENT

The authors wish to express their gratitude and appreciation for the financial support received from TETFUND for this Institution Based Research (IBR).

REFERENCES

  1. Adedoyin, M. A., Oladapo, O. J., & Adejumo, A. O. (2025). A modified biasing ridge estimator for addressing multicollinearity problem in linear regression model. Journal of the Royal Statistical Society, Nigeria Group, 2(1). https://publications.funaab.edu.ng/index.php/JRSS-NIG/article/view/1943
  2. Alkhamisi, M. & Shukur, G. (2008). Developing ridge parameters for SUR model. Communications in Statistics – Theory and Methods, 37(4), 544-564. doi: 10.1080/03610920701469152.
  3. Alkhamisi, M., Khalaf, G., & Shukur, G. (2006). Some modifications for choosing ridge parameters. Communications in Statistics – Theory and Methods, 35(11), 2005-2020. doi: 10.1080/03610920600762905.
  4. Arashi, M. & Valizadeh, T. (2015). Performance of Kibria’s methods in partial linear ridge regression model. Statistical Papers, 56(1), 231-246. doi:10.1007/s00362-014-0578-6.
  5. Aslam, M. (2014). Performance of Kibria's method for the heteroscedastic ridge regression model: Some Monte Carlo evidence. Communications in Statistics – Simulation and Computation. 43(4), 673-686. doi:10.1080/03610918.2012.712185.
  6. Dempster, A. P., Schatzoff, M., & Wermuth, N. (1977). A simulation study of alternatives to ordinary least squares. Journal of the American Statistical Association, 72(357), 77-91. doi: 10.1080/01621459.1977.10479910.
  7. Dorugade, A. V. (2016). New ridge parameters for ridge regression. Journal of the Association of Arab Universities for Basic and Applied Sciences, 1-6.
  8. Gibbons, D. G. (1981). A simulation study of some ridge estimators. Journal of the American Statistical Association, 76(373), 131-139. doi: 10.1080/01621459.1981.10477619.
  9. Hefnawy, E. A. & Farag A. (2013). A combined nonlinear programming model and Kibria method for choosing ridge parameter regression. Communications in Statistics – Simulation and Computation, 43(6). doi:10.1080/03610918.2012.735317.
  10. Hocking, R. R., Speed, F. M., & Lynn, M. J. (1976). A class of biased estimators in linear regression. Technometrics, 18(4), 55-67. doi:10.1080/00401706.1976.10489474.
  11. Hoerl, A. E. & Kennard, R. W. (1970). Ridge regression: Biased estimation for non-orthogonal problems. Technometrics, 12(1), 55-67. doi:10.1080/00401706.1970.10488634.
  12. Hoerl, A. E., Kennard, R. W., & Baldwin, K. F. (1975). Ridge regression: Some simulations. Communications in Statistics, 4(2), 105-123. doi:10.1080/03610927508827232.
  13. Idowu, J. I., Oladapo, O. J., Owolabi, A. T., & Ayinde, K. (2022). On the biased two-parameter estimator to combat multicollinearity in linear regression model. African Scientific Reports, 1(3), 188–204.
  14. Idowu, J. I., Oladapo, O. J., Owolabi, A. T., Ayinde, K., & Akinmoju, O. (2023). Combating multicollinearity: A new two-parameter approach. Nicel Bilimler Dergisi, 5(2), 90–116. https://doi.org/10.51541/nicel.1084768
  15. Khalaf, G. (2012). A proposed ridge parameter to improve the least squares estimator. Journal of Modern Applied Statistical Methods, 11(2), 443-449.
  16. Khalaf, G. & Shukur, G. (2005). Choosing ridge parameters for regression problems. Communications in Statistics – Theory and Methods, 34(5), 1177-1182. doi: 10.1081/STA-200056836.
  17. Kibria, B. M. G. (2003). Performance of some new ridge regression estimators. Communications in Statistics – Simulation and Computation, 32(2), 419-435. doi: 10.1081/SAC-120017499.
  18. Kibria, B. M. G. & Lukman, A. F. (2020). A new ridge-type estimator for the linear regression model: Simulations and applications. Scientifica, Article ID 9758378, 16 pages.
  19. Kibria, B. M. G. (2022). More than hundred (100) estimators for estimating the shrinkage parameter in linear and generalized linear ridge regression models. Journal of Econometrics and Statistics, 2(2), 233–252.
  20. Lukman, A. F. & Ayinde, K. (2017). Review and classifications of the ridge parameter estimation techniques. Hacettepe Journal of Mathematics and Statistics, 46(5), 953-967.
  21. Lawless, J. F. & Wang, P. (1976). A simulation study of ridge and other regression estimators. Communications in Statistics – Theory and Methods, 5(4), 307-323. doi: 10.1080/03610927608827353.
  22. Mansson, K., Shukur, G. & Kibria, B. M. G. (2010). On some ridge regression estimators: A Monte Carlo simulation study under different error variances. Journal of Statistics, 17(1), 1-22.
  23. McDonald, G. C. & Galarneau, D. I. (1975). A Monte Carlo evaluation of ridge-type estimators. Journal of the American Statistical Association, 70(350), 407-416. doi: 10.1080/01621459.1975.10479882.
  24. Muniz, G. & Kibria, B. M. G. (2009). On some ridge regression estimators: An empirical comparison. Communications in Statistics – Simulation and Computation, 38(3), 621-630. doi: 10.1080/03610910802592838.
  25. Muniz, G., Kibria, B. M. G., Mansson, K., & Shukur, G. (2012). On developing ridge regression parameters: A graphical investigation. Statistics and Operations Research Transactions, 36(2), 115-138.
  26. Nomura, M. (1988). On the almost unbiased ridge regression estimation. Communication in Statistics – Simulation and Computation, 17(3), 729-743. doi:10.1080/03610918808812690
  27. Oladapo, O. J., Owolabi, A. T., Idowu, J. I., & Ayinde, K. (2022). A new modified Liu ridge-type estimator for the linear regression model: Simulation and application. International Journal of Clinical Biostatistics and Biometrics, 8(2).
  28. Oladapo, O. J., Idowu, J. I., Owolabi, A. T., & Ayinde, K. (2023). A new biased two-parameter estimator in linear regression model. EQUATIONS, 3, 73–92. https://doi.org/10.37394/232021.2023.3.10
  29. Oladapo, O. J., Alabi, O. O., & Ayinde, K. (2024). Another new two-parameter estimator in dealing with multicollinearity in the logistic regression model. International Journal of Mathematical Sciences and Optimization: Theory and Applications, 10(2), 22–35. https://doi.org/10.5281/zenodo.10937145
  30. Owolabi, A. T., Ayinde, K., Idowu, J. I., Oladapo, O. J., & Lukman, D. F. (2022). A New Two-Parameter Estimator in the Linear Regression Model with Correlated Regressors. Journal of Statistics Applications & Probability, 11(2), 499-512. http://dx.doi.org/10.18576/jsap/110211.


Raheed Saheed Lekan
Corresponding author

Department of Statistics, School of Science and Technology, Federal Polytechnic, Ayede, Oyo State, Nigeria

Owolabi Muhammed Ishola
Co-author

Department of Statistics, School of Science and Technology, Federal Polytechnic, Ayede, Oyo State, Nigeria

James Olasunkanmi Oladapo
Co-author

Department of Statistics, Faculty of Science, Ladoke Akintola University of Technology, Ogbomoso, Oyo State, Nigeria

Olabode John Oluwasina
Co-author

Department of General Studies, School of Management, Federal Polytechnic, Ayede, Oyo State, Nigeria

Fawolu Oluseyi Ajayi
Co-author

Department of General Studies, School of Management, Federal Polytechnic, Ayede, Oyo State, Nigeria

Teliat Rasheed Olusanjo
Co-author

Department of Science Laboratory Technology, School of Science and Technology, Federal Polytechnic, Ayede, Oyo State, Nigeria

Raheed Saheed Lekan*, Owolabi Muhammed Ishola, James Olasunkanmi Oladapo, Olabode John Oluwasina, Fawolu Oluseyi Ajayi, Teliat Rasheed Olusanjo, Some Ridge Biasing Parameter for Linear Regression Model and Their Performances on Kibria-Lukman Estimator, Int. J. Sci. R. Tech., 2025, 2 (12), 14-23. https://doi.org/10.5281/zenodo.18118913
