The Nadaraya-Watson regression estimator is known to be highly sensitive to the presence of outliers in the sample. A possible way of robustification consists in using local L-estimates of regression. Whereas local L-estimation is traditionally done using an empirical conditional...
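The outlier sensitivity described above can be seen in a minimal sketch of the standard (non-robust) Nadaraya-Watson estimator; the function name, Gaussian kernel, and bandwidth `h` are illustrative choices, not part of the cited work, and the robust local L-estimation the abstract proposes is not implemented here.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Nadaraya-Watson estimator with a Gaussian kernel:
    m_hat(x) = sum_i K((x - x_i)/h) * y_i / sum_i K((x - x_i)/h).
    """
    diffs = (x_eval[:, None] - x_train[None, :]) / h
    weights = np.exp(-0.5 * diffs**2)          # Gaussian kernel weights
    return (weights @ y_train) / weights.sum(axis=1)

# Noisy sine data with one gross outlier: the local weighted mean is
# pulled toward the outlier near x[50], unlike a robust L-estimate.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 100))
y = np.sin(x) + 0.1 * rng.normal(size=100)
y[50] = 50.0                                   # single contaminated observation
grid = np.linspace(0, 2 * np.pi, 5)
print(nadaraya_watson(x, y, grid, h=0.3))
```

Because the estimate is a weighted mean of the responses, a single large outlier distorts the fit over an entire kernel window; replacing the mean with a local L-statistic (e.g. a trimmed mean) is the robustification route the abstract refers to.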
Persistent link: https://www.econbiz.de/10010983558
Persistent link: https://www.econbiz.de/10010983560
Classical parametric estimation methods applied to nonlinear regression and limited-dependent-variable models are very sensitive to misspecification and data errors. On the other hand, semiparametric and nonparametric methods, which are not restricted by parametric assumptions, require more data...
Persistent link: https://www.econbiz.de/10010983572
Persistent link: https://www.econbiz.de/10010983774
Persistent link: https://www.econbiz.de/10010983816
Most dimension reduction methods based on nonparametric smoothing are highly sensitive to outliers and to data coming from heavy-tailed distributions. We show that the MAVE and OPG methods recently proposed by Xia et al. (2002) can be made robust in a relatively straightforward way...
Persistent link: https://www.econbiz.de/10010983843
This paper offers a new method for estimation and forecasting of linear and nonlinear time series when the stationarity assumption is violated. Our local parametric approach applies in particular to general varying-coefficient parametric models, such as AR or GARCH, whose coefficients...
Persistent link: https://www.econbiz.de/10010274136
We will study causal relationships of a known form between random variables. Given a model, we distinguish one or more dependent (endogenous) variables Y = (Y1, ..., Yl), l ∈ N, which are explained by the model, and independent (exogenous, explanatory) variables X = (X1, ..., Xp), p ∈ N,...
Persistent link: https://www.econbiz.de/10010296407
Many methods of computational statistics lead to matrix-algebra or numerical-mathematics problems. For example, the least squares method in linear regression reduces to solving a system of linear equations. The principal components method is based on finding eigenvalues and eigenvectors of a...
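Both reductions mentioned in the abstract can be shown in a few lines; this is a generic illustration with simulated data, not code from the cited work. Least squares is solved here via the normal equations (X'X)β = X'y, and principal components are obtained as eigenvectors of the sample covariance matrix.

```python
import numpy as np

# Least squares as a system of linear equations: solve (X'X) beta = X'y.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + 0.01 * rng.normal(size=50)
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Principal components as an eigenvalue problem: eigenvectors of the
# sample covariance matrix are the component directions, eigenvalues
# the variances along them.
Z = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.2])
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
print(beta_hat)
print(eigvals[::-1])   # np.linalg.eigh returns eigenvalues in ascending order
```

In practice one would solve least squares via a QR or SVD factorization rather than forming X'X explicitly, since the normal equations square the condition number; the normal-equations form is used here only because it makes the reduction to a linear system explicit.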
Persistent link: https://www.econbiz.de/10010296419
Most dimension reduction methods based on nonparametric smoothing are highly sensitive to outliers and to data coming from heavy-tailed distributions. We show that the MAVE and OPG methods recently proposed by Xia et al. (2002) can be made robust in a relatively straightforward way...
Persistent link: https://www.econbiz.de/10010296438