Question 1. Let Y = X1 + X2 + ... + XN where N =d Bi(n, p), Xi =d Bi(m, q), and N, X1, X2, ... are independent.
(a) Find PY|N (z), the conditional probability generating function of Y given N , and state the values of z for which it is defined.
(b) Find PY(z), the probability generating function of Y, and state the values of z for which it is defined.
(c) Using PY(z), evaluate E(Y).
Question 2. Let X, Y , and Z be independent and uniformly distributed random variables on the interval (0, 1).
(a) Let U = X + Y . Find fU(u), the probability density function of U, and state the values of u for which it is defined.
(b) Find MU(t), the moment generating function of U, and state the values of t for which it is defined.
(c) Using MU(t), evaluate E(U).
(d) Let V = X + Y + Z. Find fV(v), the probability density function of V, and state the values of v for which it is defined.
(e) Find MV(t), the moment generating function of V , and state the values of t for which it is defined.
(f) For i = 1, 2, ..., 1000, let Xi be independent and uniformly distributed random variables on the interval (0, 1). Let W = X1 + X2 + ... + X1000. Use the central limit theorem to approximate P(480 < W < 510).
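For part (f), the CLT approximation only needs the mean and variance of W: E(W) = 1000 · (1/2) = 500 and V(W) = 1000 · (1/12). A minimal numerical sketch (using the standard normal CDF written via the error function, since this is not part of the required hand calculation):

```python
import math

def Phi(z):
    # standard normal CDF expressed via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu = 1000 * 0.5        # E(W) = 1000 * E(Xi) = 500
var = 1000 / 12.0      # V(W) = 1000 * V(Xi) = 1000/12
sd = math.sqrt(var)

# CLT: P(480 < W < 510) ~ Phi((510 - mu)/sd) - Phi((480 - mu)/sd)
approx = Phi((510 - mu) / sd) - Phi((480 - mu) / sd)
print(round(approx, 4))
```

The standardized endpoints are (510 − 500)/sd ≈ 1.10 and (480 − 500)/sd ≈ −2.19, giving a probability of roughly 0.85.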
Question 3. Let X and Y have joint probability density function f given by
f(x, y) = 2(x + y) if 0 ≤ y ≤ x ≤ 1, and 0 otherwise.
(a) Find fX|Y (x|y), the conditional density function of X given Y.
(b) Evaluate E(X|Y ), the conditional expectation of X given Y.
(c) Verify that E(X) = E(E(X|Y )).
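For part (c), a direct calculation over the triangle gives E(X) = ∫₀¹ ∫₀ˣ x · 2(x + y) dy dx = ∫₀¹ 3x³ dx = 3/4, and a rejection-sampling simulation can be used to check this numerically. The sketch below is a check on the answer, not part of the required derivation; it uses the fact that f is bounded by f(1, 1) = 4 on the triangle:

```python
import random

rng = random.Random(1)

def sample_xy():
    # Rejection sampling: propose (x, y) uniform on the triangle 0 <= y <= x <= 1
    # (by folding the unit square), then accept with probability f(x, y) / 4,
    # since f is bounded above by f(1, 1) = 4 on the triangle.
    while True:
        x, y = rng.random(), rng.random()
        if y > x:
            x, y = y, x  # fold the square onto the triangle y <= x
        if rng.random() < 2 * (x + y) / 4.0:
            return x, y

trials = 40000
mean_x = sum(sample_xy()[0] for _ in range(trials)) / trials
print(mean_x)  # should be close to E(X) = 3/4
```

The same simulated sample can also be used to check E(X|Y) from part (b) by averaging x within narrow bands of y.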
Question 4. Let X =d N(µ, σ²) and Y = e^X.
(a) Using the approximation formulae on Slides 367 and 368, derive expressions for E(Y ) and V (Y ) in terms of µ and σ.
(b) Derive the exact expressions for E(Y ) and V (Y ) in terms of µ and σ.
(c) Let X =d N (0, 1). Compare the approximated and exact values for E(Y ) and V (Y ).
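For part (c), the comparison can be tabulated numerically. The sketch below assumes the slide formulae are the usual second-order Taylor (delta-method) approximations E(g(X)) ≈ g(µ) + g″(µ)σ²/2 and V(g(X)) ≈ g′(µ)²σ²; if the slides use different formulae, substitute accordingly. The exact values use the standard lognormal moments:

```python
import math

mu, sigma = 0.0, 1.0  # X =d N(0, 1), Y = e^X

# Delta-method approximations (assumed form of the slide formulae):
# with g(x) = e^x, g'(mu) = g''(mu) = e^mu.
approx_E = math.exp(mu) * (1 + sigma**2 / 2)  # = 1.5 for N(0, 1)
approx_V = math.exp(2 * mu) * sigma**2        # = 1.0 for N(0, 1)

# Exact lognormal moments.
exact_E = math.exp(mu + sigma**2 / 2)                             # e^{1/2}
exact_V = (math.exp(sigma**2) - 1) * math.exp(2 * mu + sigma**2)  # (e - 1) e

print(approx_E, exact_E)
print(approx_V, exact_V)
```

The approximation is noticeably worse for the variance than for the mean here: the exact variance (e − 1)e ≈ 4.67 is several times the delta-method value, because σ = 1 is not small relative to the curvature of e^x.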