1. Consider the following two estimated least squares regressions, with standard errors in parentheses:
At = 26.19 + 0.6248 Bt - 0.4398 t                      R2 = 0.999
     (2.73)   (0.006)     (0.074)

(A/B)t = 25.921 (1/Bt) + 0.6246 - 0.4315 (t/Bt)        R2 = 0.875
         (2.22)           (0.007)   (0.060)
(a) Assuming that the second equation has been estimated to remove heteroscedasticity, what assumption has been made about the error variance in the first equation?
(b) Assuming that there is indeed heteroscedasticity, state the properties of the LS estimator used to estimate the first equation.
(c) Assuming that there is indeed heteroscedasticity as modelled, state the properties of the LS estimator used to estimate the second equation.
(d) Describe the steps you would follow to carry out White's test for heteroscedasticity in the first equation (a sketch of these steps follows this question).
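For part (d), a minimal sketch of White's test in Python, using statsmodels and simulated placeholder data in place of the actual series for At, Bt and the trend t (the data-generating values below are illustrative assumptions, not taken from the question):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

# Simulated placeholder data standing in for A_t, B_t and the trend t.
rng = np.random.default_rng(0)
n = 50
t = np.arange(1, n + 1, dtype=float)
B = 100 + 2 * t + rng.normal(scale=5, size=n)
A = 26 + 0.62 * B - 0.44 * t + rng.normal(scale=B / 50)  # error spread grows with B

df = pd.DataFrame({"A": A, "B": B, "t": t})

# Step 1: estimate the first equation by OLS and keep the squared residuals.
X = sm.add_constant(df[["B", "t"]])
e2 = sm.OLS(df["A"], X).fit().resid ** 2

# Step 2: White's auxiliary regression of the squared residuals on the
# regressors, their squares and their cross-product.
Z = sm.add_constant(pd.DataFrame({
    "B": df["B"], "t": df["t"],
    "B2": df["B"] ** 2, "t2": df["t"] ** 2, "Bt": df["B"] * df["t"],
}))
aux = sm.OLS(e2, Z).fit()

# Step 3: n*R^2 from the auxiliary regression is asymptotically chi-squared
# with 5 degrees of freedom under the null of homoscedasticity.
stat = n * aux.rsquared
pval = stats.chi2.sf(stat, df=Z.shape[1] - 1)
print(f"White statistic = {stat:.2f}, p-value = {pval:.3f}")
```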
2. Consider the partitioned regression model yt: y = X1β1 + X2β2 + ε, where Ω = E[εε'] is the variance-covariance matrix of the error terms. Find the formula for the GLS estimator and suggest a feasible GLS estimator.
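A minimal numerical sketch of the GLS formula b = (X'Ω⁻¹X)⁻¹X'Ω⁻¹y and one feasible version, under the partitioned setup stated above; the simulated data, the diagonal form of Ω, and the log-variance skedastic function used to estimate it are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])  # first block of regressors
X2 = rng.normal(size=(n, 1))                            # second block
X = np.hstack([X1, X2])
beta_true = np.array([1.0, 0.5, -0.3])
sigma2 = np.exp(X2[:, 0])                  # hypothetical heteroscedastic variances
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2))

# GLS with known Omega = diag(sigma2): beta = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y.
Omega_inv = np.diag(1.0 / sigma2)
beta_gls = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)

# Feasible GLS: estimate the variances from OLS residuals (here via a
# log-variance regression on X2, mirroring the assumed skedastic function),
# then re-apply the GLS formula with the estimated Omega.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta_ols
W = np.column_stack([np.ones(n), X2[:, 0]])
gamma = np.linalg.lstsq(W, np.log(e ** 2), rcond=None)[0]
Omega_inv_hat = np.diag(1.0 / np.exp(W @ gamma))
beta_fgls = np.linalg.solve(X.T @ Omega_inv_hat @ X, X.T @ Omega_inv_hat @ y)
print(beta_gls, beta_fgls)
```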
3. Consider the simple linear regression model yt = β0 + β1xt + εt, where the errors exhibit second-order autocorrelation: εt = ρ2εt-2 + μt.
(a) Calculate the autocovariance function γ(s) = cov(εt, εt-s) under the assumption that εt is a weakly (covariance) stationary process.
(b) Assuming ρ2 is known, derive the GLS estimator of β1.
(c) Describe how one would calculate an asymptotically efficient two-step estimator of β1 if ρ2 were unknown (a sketch of one such procedure follows this question).
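For part (c), a minimal sketch of a two-step procedure with simulated data; the residual-based estimator of ρ2 (a no-intercept regression of the OLS residuals on their second lags) and the lag-2 quasi-differencing transformation are standard choices assumed here, not prescriptions from the original question:

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta0, beta1, rho2 = 200, 1.0, 2.0, 0.6

# Simulate errors with the lag-2 autoregressive structure e_t = rho2*e_{t-2} + u_t.
u = rng.normal(size=n)
e = np.zeros(n)
for s in range(2, n):
    e[s] = rho2 * e[s - 2] + u[s]
x = rng.normal(size=n)
y = beta0 + beta1 * x + e

# Step 1: OLS on the original model, keep the residuals.
X = np.column_stack([np.ones(n), x])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
ehat = y - X @ b_ols

# Step 2: estimate rho2 by a no-intercept regression of ehat_t on ehat_{t-2}.
rho2_hat = (ehat[2:] @ ehat[:-2]) / (ehat[:-2] @ ehat[:-2])

# Step 3: quasi-difference at lag 2,
#   y_t - rho2*y_{t-2} = beta0*(1 - rho2) + beta1*(x_t - rho2*x_{t-2}) + u_t,
# and re-estimate by OLS; the slope coefficient is the two-step estimator of beta1.
y_star = y[2:] - rho2_hat * y[:-2]
X_star = np.column_stack([np.full(n - 2, 1.0 - rho2_hat), x[2:] - rho2_hat * x[:-2]])
b_two_step = np.linalg.lstsq(X_star, y_star, rcond=None)[0]
print(rho2_hat, b_two_step)
```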