One error only:
Consider a source X that has two symbols, coded in binary as 00 and 11 respectively. Each X symbol (two-bit word) has probability 1/2. When one of these two-bit words is sent through a certain channel, there is a probability p that exactly one of its two bits is flipped to the opposite binary value before it is received at Y.
For example, 00 may be received as 00 with probability 1 - p, as 01 with probability p/2, or as 10 with probability p/2. Two errors never occur in the same word. (Throughout the exercise use the notation H(p) = -p log p - (1 - p) log (1 - p).)
(a) Find H(Y|X).
(b) Find H(Y).
(c) What is the mutual information I(X; Y)?
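The three quantities above can be checked numerically. The sketch below, assuming base-2 logarithms and an illustrative value p = 0.1, builds the channel's transition probabilities exactly as described, computes H(Y|X), H(Y), and I(X; Y) directly from their definitions, and verifies them against the closed forms they reduce to (H(p) + p, H(p) + 1, and 1 - p); the variable names are my own, not part of the exercise.

```python
from math import log2

p = 0.1  # assumed example crossover probability

def H(dist):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(q * log2(q) for q in dist if q > 0)

# Transition probabilities P(Y = y | X = x): each word is received
# intact with probability 1 - p, or with one bit flipped (p/2 each).
channel = {
    "00": {"00": 1 - p, "01": p / 2, "10": p / 2},
    "11": {"11": 1 - p, "10": p / 2, "01": p / 2},
}
px = {"00": 0.5, "11": 0.5}  # equiprobable source symbols

# (a) H(Y|X) = sum_x P(x) H(Y | X = x); both rows have equal entropy.
H_Y_given_X = sum(px[x] * H(channel[x].values()) for x in px)

# (b) Marginal P(y) = sum_x P(x) P(y|x), then H(Y).
py = {}
for x, row in channel.items():
    for y, q in row.items():
        py[y] = py.get(y, 0.0) + px[x] * q
H_Y = H(py.values())

# (c) I(X; Y) = H(Y) - H(Y|X).
I_XY = H_Y - H_Y_given_X

# Check against the closed forms using the exercise's H(p) notation.
Hp = -p * log2(p) - (1 - p) * log2(1 - p)
assert abs(H_Y_given_X - (Hp + p)) < 1e-12   # H(Y|X) = H(p) + p
assert abs(H_Y - (Hp + 1)) < 1e-12           # H(Y)   = H(p) + 1
assert abs(I_XY - (1 - p)) < 1e-12           # I(X;Y) = 1 - p
```

Note that the two rows of the transition matrix are permutations of each other, so H(Y|X = x) is the same for both inputs, and the mutual information comes out to 1 - p: the one bit of source information, degraded by the error probability.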