Variance estimation for nonparametric regression and its applications
Traditionally, nonparametric regression research has centered on the mean estimation problem. As a rule, the variance is presumed to be an unknown constant, and one of several standard estimators is then used to estimate it. This approach is often not completely satisfactory for two reasons. First, the homoscedasticity assumption is frequently untenable in practice. Second, it fails to take into account that many applications, such as the construction of confidence or prediction intervals, require sufficiently precise local variance estimators (e.g., estimators that minimize the mean integrated squared error (MISE)). This dissertation presents a class of simple difference-based kernel estimators for the local variance function of a linear nonparametric regression model. It is shown that, in the optimal case, the MISE of these estimators converges at the rate n^{-4/5}, with a constant depending on the kernel and on the order r of the difference sequence. Another important feature is that the bias component is independent of the difference order r. The resulting class of estimators enjoys properties superior to those of the estimator class proposed as a solution to this problem in [23]. Finally, possible bandwidth selection procedures are sketched.
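The following is a minimal illustrative sketch (not the dissertation's exact estimator) of the general difference-based idea: squared pseudo-residuals formed from a first-order (r = 1) difference of neighboring responses are kernel-smoothed to estimate the local variance function. The function name, the Gaussian kernel, the Nadaraya-Watson weighting, and the fixed bandwidth h are all illustrative assumptions.

```python
import numpy as np

def local_variance_estimate(x, y, x_eval, h):
    """Sketch of a first-order difference-based kernel estimate of sigma^2(x).

    Assumes Y_i = m(x_i) + e_i with smooth m and a dense design, so that
    E[(Y_{i+1} - Y_i)^2] / 2 is approximately sigma^2 at neighboring points.
    """
    order = np.argsort(x)
    x_s, y_s = x[order], y[order]

    # Squared half-differences serve as pseudo-observations of sigma^2.
    d2 = (np.diff(y_s) ** 2) / 2.0
    t = (x_s[:-1] + x_s[1:]) / 2.0  # attach each pseudo-observation to a midpoint

    # Gaussian-kernel (Nadaraya-Watson) smoothing of the pseudo-observations.
    u = (x_eval[:, None] - t[None, :]) / h
    w = np.exp(-0.5 * u ** 2)
    return (w * d2[None, :]).sum(axis=1) / w.sum(axis=1)

# Usage on a heteroscedastic toy model with sigma(x) = 0.2 + 0.3 x.
rng = np.random.default_rng(0)
n = 500
x = np.sort(rng.uniform(0.0, 1.0, n))
sigma = 0.2 + 0.3 * x
y = np.sin(2 * np.pi * x) + sigma * rng.normal(size=n)
grid = np.linspace(0.05, 0.95, 19)
print(np.round(local_variance_estimate(x, y, grid, h=0.08), 3))
```

In this sketch the bandwidth h is simply supplied by the user; choosing it in a data-driven way is exactly the bandwidth selection question raised at the end of the abstract.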