Modified Two-Parameter Ridge Estimators for Enhanced Regression Performance in the Presence of Multicollinearity: Simulations and Medical Data Applications
| Main Authors: | , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-07-01 |
| Series: | Axioms |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2075-1680/14/7/527 |
| Summary: | Predictive regression models often face a common challenge known as multicollinearity. This phenomenon can distort results, causing models to overfit and produce unreliable coefficient estimates. Ridge regression is a widely used approach that incorporates a regularization term to stabilize parameter estimates and improve prediction accuracy. In this study, we introduce four newly modified ridge estimators, referred to as RIRE1, RIRE2, RIRE3, and RIRE4, aimed at tackling severe multicollinearity more effectively than ordinary least squares (OLS) and other existing estimators under both normal and non-normal error distributions. Ridge estimators are biased, so their efficiency cannot be judged by variance alone; instead, we use the mean squared error (MSE) to compare their performance. Each new estimator depends on two shrinkage parameters, *k* and *d*, making the theoretical analysis complex. To address this, we employ Monte Carlo simulations to rigorously evaluate and compare the new estimators with OLS and other existing ridge estimators. Our simulations show that the proposed estimators consistently achieve lower MSE than OLS and other ridge estimators, particularly in datasets with strong multicollinearity and large error variances. We further validate their practical value through applications to two real-world datasets, demonstrating both their robustness and their agreement with the theoretical results. |
|---|---|
| ISSN: | 2075-1680 |
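The abstract's core comparison — a two-parameter shrinkage estimator versus OLS under severe multicollinearity, judged by MSE — can be sketched in a few lines. The code below is only an illustration under stated assumptions: it uses the McDonald–Galarneau scheme to generate correlated predictors and a generic modified ridge-type form, (X'X + k(1+d)I)⁻¹X'y, as a stand-in; the paper's RIRE1–RIRE4 estimators are not specified in this record, and the names `two_param_ridge` and `mse` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate multicollinear predictors via the McDonald–Galarneau scheme:
# pairwise correlation among columns is approximately rho**2.
n, p, rho = 100, 4, 0.99
Z = rng.standard_normal((n, p + 1))
X = np.sqrt(1 - rho**2) * Z[:, :p] + rho * Z[:, [p]]
beta = np.ones(p) / np.sqrt(p)               # unit-norm true coefficients
y = X @ beta + rng.standard_normal(n)        # error variance 1 (assumed)

XtX = X.T @ X
ols = np.linalg.solve(XtX, X.T @ y)          # ordinary least squares

def two_param_ridge(k, d):
    # Illustrative two-parameter shrinkage (X'X + k(1+d)I)^{-1} X'y.
    # The paper's RIRE1-RIRE4 differ in how k and d enter; that detail
    # is not given in the abstract, so this form is an assumption.
    return np.linalg.solve(XtX + k * (1 + d) * np.eye(p), X.T @ y)

def mse(est):
    # Scalar MSE of an estimate against the known true coefficients,
    # the criterion the abstract uses to compare biased estimators.
    return float(np.sum((est - beta) ** 2))

print("OLS   MSE:", mse(ols))
print("Ridge MSE:", mse(two_param_ridge(k=0.5, d=0.5)))
```

Because the shrinkage term adds a positive constant to every eigenvalue of X'X, the ridge estimate always has smaller norm than OLS; whether its MSE is lower depends on the bias–variance trade-off that the paper's simulations quantify.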