Robust versions of the exponential and Holt-Winters smoothing methods for forecasting are presented. They are suitable for forecasting univariate time series in the presence of outliers. The robust exponential and Holt-Winters smoothing methods are formulated as recursive updating schemes that...
Persistent link: https://www.econbiz.de/10008528946
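The core idea described in the abstract above can be sketched as follows: before the usual exponential-smoothing recursion updates the level, the one-step forecast error is standardized and clipped with a Huber ψ-function, so a single outlier cannot drag the smoothed level arbitrarily far. This is a minimal illustrative sketch, not the authors' exact scheme; the fixed smoothing constant, the Huber cutoff, and the crude initial scale estimate are all simplifying assumptions.

```python
import numpy as np

def huber_psi(u, k=2.0):
    """Huber psi-function: identity in the center, clipped beyond +/- k."""
    return np.clip(u, -k, k)

def robust_exponential_smoothing(y, lam=0.3, k=2.0):
    """Robust simple exponential smoothing (illustrative sketch).

    The one-step forecast error is standardized by a scale estimate and
    clipped with the Huber psi-function, producing a "cleaned" observation
    that enters the standard recursive level update.
    """
    y = np.asarray(y, dtype=float)
    level = y[0]                                        # initialize at first observation
    scale = np.median(np.abs(np.diff(y[:10]))) or 1.0   # crude initial scale (assumption)
    fitted = [level]
    for t in range(1, len(y)):
        e = y[t] - level                                # one-step forecast error
        cleaned = level + scale * huber_psi(e / scale, k)
        level = (1 - lam) * level + lam * cleaned       # classical recursion on cleaned value
        fitted.append(level)
    return np.array(fitted)
```

For a constant series with one spike of size 100, the classical recursion would jump to roughly 30 at the spike, while the clipped update moves the level only slightly and recovers quickly.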
The minimum covariance determinant (MCD) scatter estimator is a highly robust estimator for the dispersion matrix of a multivariate, elliptically symmetric distribution. It is relatively fast to compute and intuitively appealing. In this note we derive its influence function and compute the...
Persistent link: https://www.econbiz.de/10005152940
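The MCD estimator described above searches for the h-subset of observations whose sample covariance matrix has smallest determinant. A toy version of the usual computational strategy (random starts refined by concentration steps, the core idea of the FAST-MCD algorithm) can be sketched as follows; the subset size, number of starts, and ridge regularization of near-singular starting covariances are illustrative choices, not part of the estimator's definition.

```python
import numpy as np

def mcd_location_scatter(X, h=None, n_starts=50, n_csteps=10, seed=0):
    """Toy minimum covariance determinant estimator (illustrative sketch).

    Random (p+1)-point starts are refined by C-steps: compute the subset's
    mean and covariance, then concentrate on the h observations with the
    smallest Mahalanobis distances. The subset with the smallest covariance
    determinant wins.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    if h is None:
        h = (n + p + 1) // 2                  # default subset size, roughly n/2
    best_det, best_idx = np.inf, None
    for _ in range(n_starts):
        idx = rng.choice(n, size=p + 1, replace=False)
        for _ in range(n_csteps):
            mu = X[idx].mean(axis=0)
            S = np.cov(X[idx].T) + 1e-8 * np.eye(p)   # guard against singular starts
            d = np.einsum('ij,jk,ik->i', X - mu, np.linalg.inv(S), X - mu)
            idx = np.argsort(d)[:h]           # concentrate on the h closest points
        det = np.linalg.det(np.cov(X[idx].T))
        if det < best_det:
            best_det, best_idx = det, idx
    return X[best_idx].mean(axis=0), np.cov(X[best_idx].T)
```

On data with a 10% outlying cluster, the MCD location stays near the center of the clean points while the ordinary sample mean is pulled toward the outliers.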
Li and Chen (J. Amer. Statist. Assoc. 80 (1985) 759) proposed a method for principal components using projection-pursuit techniques. In classical principal components one searches for directions with maximal variance, and their approach consists of replacing this variance by a robust scale...
Persistent link: https://www.econbiz.de/10005153034
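The replacement described above, maximizing a robust scale of the projections instead of their variance, can be illustrated with a minimal sketch. Here the robust scale is the MAD, and the search is restricted to directions pointing from the coordinatewise median toward each observation, a common heuristic for projection-pursuit estimators; both choices are illustrative assumptions.

```python
import numpy as np

def mad(x):
    """Median absolute deviation, a robust scale estimate (unscaled)."""
    return np.median(np.abs(x - np.median(x)))

def first_pp_direction(X):
    """First projection-pursuit principal direction (illustrative sketch).

    Classical PCA maximizes the variance of the projected data over unit
    directions; here the variance is replaced by the MAD, and candidate
    directions run from the coordinatewise median to each data point.
    """
    center = np.median(X, axis=0)
    D = X - center
    norms = np.linalg.norm(D, axis=1)
    cand = D[norms > 0] / norms[norms > 0, None]   # unit candidate directions
    scores = [mad(D @ a) for a in cand]
    return cand[int(np.argmax(scores))]
```

With one extreme outlier placed orthogonally to the main spread, the classical first principal component chases the outlier while the projection-pursuit direction still recovers the dominant axis of the clean data.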
In this paper it is studied how observations in the training sample affect the misclassification probability of a quadratic discriminant rule. An approach based on partial influence functions is followed. It makes it possible to quantify the effect of observations in the training sample on the performance...
Persistent link: https://www.econbiz.de/10005106973
Our aim is to construct a factor analysis method that can resist the effect of outliers. For this we start with a highly robust initial covariance estimator, after which the factors can be obtained from maximum likelihood or from principal factor analysis (PFA). We find that PFA based on the...
Persistent link: https://www.econbiz.de/10005221368
It is well known that k-step M-estimators can yield high efficiency without losing the breakdown point of the initial estimator. In this note we derive their bias curves. In the location framework the bias increases only slightly with k, but in the scale case the bias curves change considerably.
Persistent link: https://www.econbiz.de/10005254434
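The construction referred to above can be sketched for the location case: start from the sample median (50% breakdown point) and take k iteratively reweighted steps with Huber weights, holding the scale fixed at the normalized MAD. The particular tuning constant c = 1.345 (the usual value giving 95% efficiency at the normal) and the fixed-scale simplification are illustrative assumptions.

```python
import numpy as np

def k_step_m_location(x, k_steps=1, c=1.345):
    """k-step Huber M-estimator of location (illustrative sketch).

    Starts from the median and performs k reweighted-mean updates with
    Huber weights; the scale is held fixed at the normalized MAD. A few
    steps recover most of the efficiency of the fully iterated estimator
    while inheriting the breakdown point of the median.
    """
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    s = 1.4826 * np.median(np.abs(x - mu))   # MAD, consistent at the normal
    for _ in range(k_steps):
        u = (x - mu) / s
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))   # Huber weights
        mu = np.sum(w * x) / np.sum(w)
    return mu
```

On a sample with 10% of the points shifted by 50, the one-step estimate stays near the clean center while the sample mean is pulled far away.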
In this note we discuss the breakdown behavior of the maximum likelihood (ML) estimator in the logistic regression model. We formally prove that the ML-estimator never explodes to infinity, but rather breaks down to zero when adding severe outliers to a data set. An example confirms this behavior.
Persistent link: https://www.econbiz.de/10005254635
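The breakdown-to-zero behavior stated in the note above can be illustrated with a toy one-parameter logistic model (no intercept, labels in {-1, +1}), fitted by maximizing the likelihood over a grid of nonnegative slopes. The grid search and the restriction to a single slope parameter are simplifications for illustration; the point is only that one severe, misclassified leverage point drives the ML slope toward zero rather than to infinity.

```python
import numpy as np

def logistic_nll(b, x, y):
    """Negative log-likelihood of a no-intercept logistic model; y in {-1, +1}.

    np.logaddexp(0, t) evaluates log(1 + exp(t)) without overflow, which
    matters once the leverage point makes b * x very large.
    """
    return np.logaddexp(0.0, -y * b * x).sum()

def ml_slope(x, y, grid=np.linspace(0.0, 10.0, 2001)):
    """ML slope by grid search over nonnegative slopes (toy setup, positive true slope)."""
    return grid[np.argmin([logistic_nll(b, x, y) for b in grid])]

rng = np.random.default_rng(4)
x = rng.normal(size=100)
y = np.where(rng.random(100) < 1.0 / (1.0 + np.exp(-2.0 * x)), 1.0, -1.0)

b_clean = ml_slope(x, y)                    # slope on clean data: well away from zero
b_out = ml_slope(np.append(x, 1000.0),      # one severe, misclassified leverage point
                 np.append(y, -1.0))        # pulls the ML slope toward zero
```

The convexity of the logistic negative log-likelihood in the slope means the single outlier's penalty, which grows linearly in b, dominates the fit, so the minimizer collapses toward zero instead of exploding.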