A large deviation result for the least squares estimators in nonlinear regression
We prove a large deviation result for the least squares (LS) estimator θ̂ in a nonlinear regression model with dependent errors, namely an exponential inequality for the probability that θ̂ deviates from the true parameter θ₀ by a given amount. The bound is of the same quality as that of Sieders and Dzhaparidze (1987), which was obtained under independent errors. This generalizes the results of Sieders and Dzhaparidze (1987) and Prakasa Rao (1984).
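An exponential inequality of this kind can be sketched in the following generic form (illustrative only, not the paper's exact statement; the norming sequence $d_n$, the constants $B$, $b$, and the exponent $\alpha$ are assumptions made here for concreteness):

```latex
% Generic Sieders--Dzhaparidze-type large deviation bound (illustrative sketch):
% for all H \ge H_0 and all n,
\mathbb{P}_{\theta_0}\!\left( d_n \,\bigl| \hat{\theta}_n - \theta_0 \bigr| \ge H \right)
  \;\le\; B \, e^{-b H^{\alpha}},
% where d_n is a norming sequence and B, b, \alpha > 0 are constants
% independent of n and H.
```

The point of such a bound is that the deviation probability decays exponentially in $H$, uniformly in the sample size, which is stronger than mere consistency of the estimator.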