1. Consider the linear regression model
y = Xβ + u,
where y is an n x 1 vector of observations on the dependent variable, X is an n x k matrix of observations on k non-stochastic explanatory variables, β = (β1, ..., βk)' is a k x 1 vector of unknown coefficients, u is an n x 1 vector of unobservable random disturbances such that E(u) = 0 and E(uu') = σ²In, σ² is an unknown positive constant, and In is the n x n identity matrix. Let β^ = (X'X)⁻¹X'y be the ordinary least squares (OLS) estimator of β.
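For concreteness, a minimal numerical sketch of this setup in Python (the sample size, coefficient values, noise scale, and random seed below are illustrative assumptions, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated design: n = 200 observations, k = 3 regressors
# (an intercept plus two covariates); beta and sigma are illustrative values.
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, 2.0, -0.5])
sigma = 1.5
u = sigma * rng.normal(size=n)        # E(u) = 0, E(uu') = sigma^2 * I_n
y = X @ beta + u

# OLS estimator: beta_hat = (X'X)^(-1) X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)
```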
(a) What assumption about X guarantees the existence of β^? Give an example where this assumption fails.
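A hedged illustration of one way the assumption can fail, using a hypothetical design matrix whose third column is an exact linear combination of the first two:

```python
import numpy as np

# Hypothetical example of perfect collinearity: the third column equals the
# sum of the first two, so rank(X) < k and X'X is singular.
X = np.array([[1.0, 2.0, 3.0],
              [1.0, 4.0, 5.0],
              [1.0, 6.0, 7.0],
              [1.0, 8.0, 9.0]])      # col3 = col1 + col2
print(np.linalg.matrix_rank(X))      # 2, not 3
print(np.linalg.det(X.T @ X))        # ~0 (up to floating point): (X'X)^(-1) does not exist
```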
(b) Define the OLS residuals as u^ = y − Xβ^. Show that X'u^ = 0.
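A sketch of the algebra being asked for in (b), assuming the existence condition from (a) holds:

```latex
% Sketch for (b), assuming (X'X)^{-1} exists:
\[
\hat{u} = y - X\hat{\beta} = y - X(X'X)^{-1}X'y
\quad\Longrightarrow\quad
X'\hat{u} = X'y - X'X(X'X)^{-1}X'y = X'y - X'y = 0 .
\]
```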
(c) Show that β^ is an unbiased estimator of β and obtain its variance-covariance matrix. How can the standard errors of the elements of β^ be estimated?
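A sketch of the standard derivation for (c), obtained by substituting the model into the estimator; the notation s² for the usual degrees-of-freedom-corrected variance estimator is an assumption of this sketch:

```latex
% Sketch for (c): substitute y = X\beta + u into \hat{\beta} = (X'X)^{-1}X'y.
\begin{align*}
\hat{\beta} &= \beta + (X'X)^{-1}X'u, & E(\hat{\beta}) &= \beta \quad (\text{since } E(u) = 0),\\
\operatorname{Var}(\hat{\beta}) &= (X'X)^{-1}X'\,E(uu')\,X(X'X)^{-1} = \sigma^{2}(X'X)^{-1}.
\end{align*}
% Standard errors: square roots of the diagonal elements of s^2 (X'X)^{-1},
% with s^2 = \hat{u}'\hat{u}/(n-k) as the usual unbiased estimator of \sigma^2.
```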
(d) Let β~ = Cy be an unbiased estimator of β, where C is a non-stochastic k x n matrix. Is there a matrix C for which β~ is a more efficient estimator than β^? Explain.
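A sketch of the Gauss-Markov argument usually invoked in (d); the decomposition of the weight matrix as C = (X'X)⁻¹X' + D is an assumption of this sketch:

```latex
% Sketch for (d): write C = (X'X)^{-1}X' + D.  Unbiasedness of \tilde{\beta} = Cy
% for every \beta requires CX = I_k, i.e. DX = 0, and then
\[
\operatorname{Var}(\tilde{\beta}) = \sigma^{2}CC'
 = \sigma^{2}(X'X)^{-1} + \sigma^{2}DD'
 = \operatorname{Var}(\hat{\beta}) + \sigma^{2}DD',
\]
% with DD' positive semi-definite, so no linear unbiased estimator Cy is more
% efficient than OLS (Gauss--Markov theorem).
```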
(e) How do your answers to (c) and (d) change if E(uu') = σ²Ω, where Ω is a symmetric and positive definite n x n matrix (with Ω ≠ In)? Explain in detail.
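A sketch of the standard answer to (e): OLS remains unbiased, but its variance takes a sandwich form and the usual standard errors from (c) are invalid; efficiency passes to generalized least squares (GLS) when Ω is known:

```latex
% Sketch for (e): OLS stays unbiased, but its variance becomes the sandwich form
\[
\operatorname{Var}(\hat{\beta}) = \sigma^{2}(X'X)^{-1}X'\Omega X(X'X)^{-1},
\]
% so the standard errors from (c) no longer apply, and the Gauss--Markov result
% in (d) no longer holds for OLS: the generalized least squares (GLS) estimator
\[
\hat{\beta}_{GLS} = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y,
\qquad
\operatorname{Var}(\hat{\beta}_{GLS}) = \sigma^{2}(X'\Omega^{-1}X)^{-1},
\]
% is the best linear unbiased estimator when \Omega is known.
```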