Summary: In this paper we present an explanation of the phenomenon pointed out by Cook and Manning (2002) concerning the unusual behaviour of the Dickey-Fuller test in the presence of trend misspecification. The rejection frequency of unit root tests under trend misspecification turns out to be very sensitive to the number of initial observations that are discarded. Based on evidence from Monte Carlo simulations, we show that for the DGP in Cook and Manning (2002), the unusual behaviour of the Dickey-Fuller test disappears once the number of discarded initial observations becomes sufficiently large.
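To illustrate the mechanics of the experiment described above, the following is a minimal sketch of a Monte Carlo study that tracks the Dickey-Fuller rejection frequency as the number of discarded initial observations varies. It is not the paper's actual design: the DGP here is a plain driftless random walk rather than the Cook and Manning (2002) DGP, the critical value (-3.41, the asymptotic 5% value for the trend-inclusive DF regression) is assumed, and all function names are hypothetical.

```python
import numpy as np

def df_tstat(y, trend=True):
    # Dickey-Fuller regression: dy_t = a + rho * y_{t-1} (+ b * t) + e_t.
    # Returns the t-statistic on rho, the coefficient on the lagged level.
    dy = np.diff(y)
    ylag = y[:-1]
    n = len(dy)
    cols = [np.ones(n), ylag]
    if trend:
        cols.append(np.arange(1, n + 1))  # deterministic trend regressor
    X = np.column_stack(cols)
    beta, _, _, _ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (n - X.shape[1])        # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)            # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])          # t-stat on rho

def rejection_frequency(n_obs=100, n_discard=0, n_reps=500,
                        crit=-3.41, seed=0):
    # Simulate a driftless random walk (the unit root null is TRUE),
    # discard the first n_discard observations, and record how often
    # the DF test (with trend) rejects at the assumed 5% critical value.
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_reps):
        e = rng.standard_normal(n_obs + n_discard)
        y = np.cumsum(e)          # random walk: y_t = y_{t-1} + e_t
        y = y[n_discard:]         # drop the initial observations
        if df_tstat(y) < crit:
            rejections += 1
    return rejections / n_reps
```

Under this null DGP the rejection frequency should stay near the nominal 5% level for any number of discarded observations; the sensitivity documented in the paper arises only under the Cook and Manning (2002) misspecified-trend DGP, which would replace the random-walk generator here.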