Suppose time-series data has been generated according to the following process:
\[
y_t = \alpha + \Phi y_{t-1} + u_t, \qquad u_t = \varepsilon_t + \rho\varepsilon_{t-1},
\]
where $\varepsilon_t$ is independent white noise. Our main interest is consistent estimation of $\Phi$ from realizations of $y_t$.
1) Provide conditions for this process to be stationary.
2) From here on, assume the process is stationary. Will OLS generally provide you with consistent point estimates of $\Phi$? Can you give conditions under which it will? Provide the asymptotic distribution of OLS under these assumptions.
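For intuition (a sketch using only the model as stated above), note that the regressor and the error are correlated through $\varepsilon_{t-1}$:
\[
\mathrm{E}[y_{t-1} u_t] \;=\; \mathrm{E}\bigl[y_{t-1}(\varepsilon_t + \rho\varepsilon_{t-1})\bigr] \;=\; \rho\,\mathrm{E}[y_{t-1}\varepsilon_{t-1}] \;=\; \rho\,\sigma_\varepsilon^2,
\]
so the OLS orthogonality condition $\mathrm{E}[y_{t-1}u_t]=0$ fails unless $\rho = 0$.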
3) ARMA processes are generally estimated by ML. Do you have enough information to set up the (marginal or conditional) likelihood function?
4) Show that
\[
\mathrm{E}\bigl[y_{t-j}(y_t - \alpha - \Phi y_{t-1})\bigr] = 0, \qquad j \geq 2,
\]
holds.
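A sketch of the argument, using only the MA(1) structure of $u_t$: by construction
\[
y_t - \alpha - \Phi y_{t-1} \;=\; u_t \;=\; \varepsilon_t + \rho\varepsilon_{t-1},
\]
while under stationarity $y_{t-j}$ depends only on $\{\varepsilon_s : s \le t-j\}$; for $j \ge 2$ these are independent of $\varepsilon_t$ and $\varepsilon_{t-1}$, which have mean zero.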
5) Use the moment conditions from Question 4 to derive a consistent GMM estimator of $(\alpha,\Phi)$. Does it require knowledge of the distribution of $\varepsilon_t$? Note that we discussed asymptotic properties of GMM estimators assuming a fixed number of moments.
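One possible just-identified construction (a sketch; the instrument set $z_t = (1,\, y_{t-2})'$ is one choice among many, the constant corresponding to the moment $\mathrm{E}[u_t]=0$ implied by the model): form the sample moments
\[
\bar g_T(\alpha,\Phi) \;=\; \frac{1}{T}\sum_{t=3}^{T} z_t\bigl(y_t - \alpha - \Phi y_{t-1}\bigr), \qquad z_t = (1,\; y_{t-2})',
\]
and set $\bar g_T(\hat\alpha,\hat\Phi)=0$, which yields a linear, IV-type estimator of $(\alpha,\Phi)$.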
6) Given your answer to the previous question, derive a lower bound on the variance of your estimator.
7) Suppose that you are uncertain about the model specification above. Moreover, you fear that $u_t$ might have the structure
\[
u_t = \varepsilon_t + \rho\varepsilon_{t-1} + \delta\varepsilon_{t-2},
\]
which is an MA(2) if $\delta \neq 0$. Can your estimation framework from above (Questions 5 and 6) be used to test the null hypothesis that $\delta = 0$? If yes, show how.
8) To convince yourself that your estimator is consistent, try it on simulated data (use Matlab or R, say). You can experiment with the number of moments to investigate the effect on the bias and the coverage of confidence intervals.
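A minimal simulation sketch in R, implementing the just-identified estimator from Question 5 with instruments $z_t = (1,\, y_{t-2})'$; the parameter values, Gaussian innovations, sample size, and burn-in are illustrative assumptions, not part of the question:
\begin{verbatim}
## Monte Carlo check of the IV/GMM estimator of (alpha, Phi) based on the
## moment conditions E[z_t (y_t - alpha - Phi*y_{t-1})] = 0, z_t = (1, y_{t-2})'.
set.seed(123)
alpha <- 0.5; Phi <- 0.7; rho <- 0.4      # true parameters (arbitrary choices)
T.obs <- 500; R.reps <- 1000              # sample size, Monte Carlo replications
burn  <- 100                              # burn-in to remove the initial condition

est <- matrix(NA_real_, R.reps, 2, dimnames = list(NULL, c("alpha", "Phi")))
for (r in 1:R.reps) {
  eps <- rnorm(T.obs + burn)              # i.i.d. N(0,1) innovations
  u   <- eps + rho * c(0, head(eps, -1))  # MA(1) errors: u_t = eps_t + rho*eps_{t-1}
  y   <- numeric(T.obs + burn)
  for (t in 2:(T.obs + burn)) y[t] <- alpha + Phi * y[t - 1] + u[t]
  y <- tail(y, T.obs)                     # drop the burn-in

  yt   <- y[3:T.obs]                      # y_t
  ylag <- y[2:(T.obs - 1)]                # y_{t-1}
  Z    <- cbind(1, y[1:(T.obs - 2)])      # instruments (1, y_{t-2})
  X    <- cbind(1, ylag)                  # regressors  (1, y_{t-1})
  est[r, ] <- solve(crossprod(Z, X), crossprod(Z, yt))  # just-identified GMM (= IV)
}
colMeans(est)                             # should be close to (alpha, Phi) = (0.5, 0.7)
apply(est, 2, sd)                         # Monte Carlo spread of the estimates
\end{verbatim}
Adding further lags of $y_t$ as instruments (e.g. $y_{t-3}$) makes the system overidentified; replacing the solve step with two-step GMM then lets you experiment with the number of moments as suggested above.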