Demonstrate the properties of symmetry, nonnegativity, and expansion of the mutual information I(X;Y) described in Section.
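As a starting point for the demonstration (a sketch only, assuming the referenced section defines mutual information in terms of differential entropies), the three properties can be stated as
I(X;Y) = h(X) - h(X|Y) = h(Y) - h(Y|X) = I(Y;X)   (symmetry)
I(X;Y) >= 0, with equality if and only if X and Y are statistically independent   (nonnegativity)
I(X;Y) = h(X) + h(Y) - h(X,Y)   (one common reading of the expansion property)
where h(X,Y) denotes the joint differential entropy of X and Y.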
Consider the continuous random variable Y, defined by
Y = X + N
where the random variables X and N are statistically independent. Show that the conditional differential entropy of Y, given X, equals
h(Y|X) = h(N)
where h(N) is the differential entropy of N.
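One possible route (a sketch, assuming X and N possess probability density functions and using the translation invariance of differential entropy): for a fixed value X = x, the conditional distribution of Y is that of x + N, because X and N are statistically independent; a constant shift leaves differential entropy unchanged, so
h(Y | X = x) = h(x + N) = h(N)
Averaging over the density f_X(x) of X then gives
h(Y|X) = ∫ h(Y | X = x) f_X(x) dx = h(N)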