a) Consider the bivariate case $y_i = \beta_0 + \beta_1 x_i + u_i$.
Using the fact that the OLS estimator $\hat\beta_1$ is an unbiased estimator of $\beta_1$, show that the OLS estimator $\hat\beta_0$ is also an unbiased estimator of $\beta_0$;
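One possible sketch of the argument (an illustration, not part of the question), assuming the standard OLS intercept formula $\hat\beta_0 = \bar y - \hat\beta_1 \bar x$ and the zero conditional mean assumption $E[u_i \mid x_i] = 0$:

$$
\bar y = \beta_0 + \beta_1 \bar x + \bar u
\quad\Longrightarrow\quad
E[\hat\beta_0] = E[\bar y - \hat\beta_1 \bar x]
= \beta_0 + \beta_1 \bar x + E[\bar u] - E[\hat\beta_1]\,\bar x
= \beta_0 ,
$$

where the last step uses $E[\bar u] = 0$ and the given fact that $E[\hat\beta_1] = \beta_1$, with expectations taken conditional on the $x_i$.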
b) Suppose that you have to fit an OLS regression to the following four points in the X-Y plane: (0; 0), (0; 1), (1; 0) and (1; 1). You would do that by minimizing the sum of squared errors. What would be the values of $\hat\beta_0$ and $\hat\beta_1$? Suppose now that, instead of minimizing the sum of squared errors, you minimize the sum of the absolute values of the errors. Would there still be a unique solution?
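The numerical check below is only an illustrative sketch, not part of the question: it fits the four points by least squares using numpy and then evaluates the sum of absolute errors for a few candidate lines of the form $y = b_0 + b_1 x$. The helper name sum_abs_errors and the particular candidate lines are my own choices for illustration.

import numpy as np

# The four points from part (b)
x = np.array([0.0, 0.0, 1.0, 1.0])
y = np.array([0.0, 1.0, 0.0, 1.0])

# Least-squares fit of y = b0 + b1*x (design matrix with an intercept column)
X = np.column_stack([np.ones_like(x), x])
b0_hat, b1_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print("OLS intercept and slope:", b0_hat, b1_hat)

# Sum of absolute errors for a candidate line y = b0 + b1*x
def sum_abs_errors(b0, b1):
    return np.abs(y - (b0 + b1 * x)).sum()

# Evaluate the absolute-error criterion for several different candidate lines
for b0, b1 in [(0.5, 0.0), (0.0, 0.0), (0.0, 1.0), (1.0, -1.0)]:
    print("line (b0, b1) =", (b0, b1), "sum of |errors| =", sum_abs_errors(b0, b1))

Comparing the printed values of the absolute-error criterion across the candidate lines is one way to check whether the minimizer is unique.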