Bootstrap Estimates of the Variances and Biases of Selected Robust Estimates of the Parameters of a Linear Model

Dennis A. Tarepe, Nestor C. Racho, Rhegie M. Caga-Anan


When we want to fit the general linear model to a set of data, several methods are available for estimating the model parameters. The most popular is the method of least squares. This method, however, has weaknesses. Alternative regression methods exist that restrain the influence of outlying data points. The least-squares method performs best when the population of errors is normally distributed. If there is reason to believe that the distribution of errors may not be normal, then least-squares estimates and tests may be much less efficient than those provided by robust alternatives such as least-absolute-deviations (LAD) regression or M-estimators. Moreover, when the sample size is fixed and no additional data can be obtained to support a normal approximation, the “bootstrap method” (Davison, 1997) may be used. The purpose of this paper is to explore the use of the bootstrap method in estimating the variances and biases of selected robust estimates of the parameters of a linear model; specifically, three robust estimators are considered. Overall, the simulation results reveal that the least-squares method still performs quite well under slight contamination, while the robust methods perform better under moderate and heavy contamination.
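The bootstrap procedure the abstract describes can be sketched concretely. The following is a minimal Python sketch, not the authors' code: it uses the pairs (case) bootstrap, resampling the (x, y) pairs with replacement B times, refitting the estimator to each resample, and taking the sample variance and the mean shift of the replicates as estimates of the estimator's variance and bias. The simulated contaminated data, the helper names ols_fit and lad_fit, and the choice of the pairs bootstrap rather than a residual bootstrap are illustrative assumptions; the M-estimator considered in the paper is omitted for brevity.

    # Minimal sketch (not the authors' code) of the pairs bootstrap for
    # estimating the variance and bias of a regression estimator.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Illustrative data: y = 2 + 3x + e, with a few gross outliers
    # standing in for contamination of the error distribution.
    n = 50
    x = rng.uniform(0, 10, n)
    e = rng.normal(0, 1, n)
    e[:5] += rng.normal(0, 10, 5)          # contaminate 5 observations
    y = 2.0 + 3.0 * x + e
    X = np.column_stack([np.ones(n), x])   # design matrix with intercept

    def ols_fit(X, y):
        """Least-squares estimate of (intercept, slope)."""
        return np.linalg.lstsq(X, y, rcond=None)[0]

    def lad_fit(X, y):
        """Least-absolute-deviations estimate, minimized numerically."""
        obj = lambda b: np.abs(y - X @ b).sum()
        return minimize(obj, ols_fit(X, y), method="Nelder-Mead").x

    def bootstrap_var_bias(X, y, fit, B=1000):
        """Pairs bootstrap: resample rows B times, refit, and return
        the bootstrap variance and bias of the fitted coefficients."""
        theta_hat = fit(X, y)
        reps = np.empty((B, X.shape[1]))
        for b in range(B):
            idx = rng.integers(0, len(y), len(y))  # resample with replacement
            reps[b] = fit(X[idx], y[idx])
        variance = reps.var(axis=0, ddof=1)
        bias = reps.mean(axis=0) - theta_hat
        return variance, bias

    for name, fit in [("OLS", ols_fit), ("LAD", lad_fit)]:
        var, bias = bootstrap_var_bias(X, y, fit)
        print(f"{name}: bootstrap variance = {var}, bias = {bias}")

Under a setup like this, the bootstrap variance of the LAD coefficients tends to compare favorably with that of the least-squares coefficients as the contamination grows heavier, which is consistent with the pattern the abstract reports for moderate and heavy contamination.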

 

Keywords: Linear models, Robust alternatives, Bootstrap, Jackknife, Simulation
