1) Describe the significance of the conditional entropy H(X|Y) of a communication system, where X is the transmitter and Y is the receiver.
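For reference, the conditional entropy (equivocation) in question 1 is defined as

```latex
H(X \mid Y) = -\sum_{x} \sum_{y} p(x, y) \log_2 p(x \mid y)
```

It measures the uncertainty about the transmitted symbol X that remains after the receiver observes Y; the information actually conveyed over the channel is the mutual information I(X; Y) = H(X) - H(X|Y).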
2) An event has six possible outcomes with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/32. Determine the entropy of the system.
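A quick numerical check for question 2 (a minimal Python sketch; the variable names are my own):

```python
import math

# Entropy of a discrete source: H(X) = -sum p_i * log2(p_i), in bits.
probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
H = -sum(p * math.log2(p) for p in probs)
print(H)  # 1.9375 bits
```

Since every probability here is a power of two, each term is exact: 1/2·1 + 1/4·2 + 1/8·3 + 1/16·4 + 2·(1/32·5) = 1.9375 bits.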
3) Explain the source coding theorem. Write down the advantages and disadvantages of channel coding in detail, and explain data compaction in detail.
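For reference, the source coding theorem asked about in question 3 states that every uniquely decodable binary code for a discrete memoryless source S satisfies the lower bound below, and that Shannon or Huffman codes achieve the upper bound:

```latex
H(S) \le \bar{L} < H(S) + 1, \qquad \bar{L} = \sum_{k} p_k l_k
```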
4) Describe the properties of entropy and, with an appropriate example, describe the entropy of a binary memoryless source in detail.
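For question 4, a binary memoryless source emitting 0 with probability p_0 and 1 with probability 1 - p_0 has entropy given by the binary entropy function:

```latex
H(p_0) = -p_0 \log_2 p_0 - (1 - p_0) \log_2 (1 - p_0)
```

It is zero when p_0 = 0 or 1 and reaches its maximum of 1 bit/symbol at p_0 = 1/2, illustrating that entropy is non-negative and is maximized by the uniform distribution.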
5) The five symbols of the alphabet of a discrete memoryless source and their probabilities are given below. S = [S0, S1, S2, S3, S4]; P[S] = [0.4, 0.2, 0.2, 0.1, 0.1]. Encode the symbols using Huffman coding.
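A minimal Python sketch of the Huffman procedure for question 5 (the function name huffman_code and the tie-breaking are my own; other tie-breaks give different but equally optimal codes):

```python
import heapq
from itertools import count

def huffman_code(probs):
    # Repeatedly merge the two least probable nodes; the tie counter
    # prevents heapq from ever comparing the dict payloads.
    tie = count()
    heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}        # left branch
        merged.update({s: "1" + w for s, w in c2.items()})  # right branch
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

code = huffman_code({"S0": 0.4, "S1": 0.2, "S2": 0.2, "S3": 0.1, "S4": 0.1})
print(code)  # one optimal solution: lengths (2, 2, 2, 3, 3), avg 2.2 bits/symbol
```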
6) Write brief notes on differential entropy, derive the channel capacity theorem, and explain the implications of the information capacity theorem.
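For the first part of question 6, the differential entropy of a continuous random variable X with probability density f_X(x) is defined as

```latex
h(X) = -\int_{-\infty}^{\infty} f_X(x) \log_2 f_X(x)\, dx
```

Unlike discrete entropy, h(X) can be negative; the information capacity theorem the question builds toward is stated after question 9 below.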
7) What do you understand by a binary symmetric channel? Derive the channel capacity formula for a symmetric channel.
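A minimal sketch for question 7 (the function names are my own): a binary symmetric channel with crossover probability p has capacity C = 1 - H(p), where H is the binary entropy function.

```python
import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with 0*log(0) taken as 0
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel, in bits per channel use
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.1))  # ~0.531
print(bsc_capacity(0.5))  # 0.0: the output carries no information about the input
```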
8) Construct a binary optimal code for the following symbol probabilities using the Huffman procedure, and compute the source entropy, average code length, efficiency, redundancy, and variance: 0.2, 0.18, 0.12, 0.1, 0.1, 0.08, 0.06, 0.06, 0.06, 0.04
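A sketch of the quantities asked for in question 8 (the helper name huffman_lengths is my own; exact lengths depend on tie-breaking, so the variance in particular can differ between equally optimal codes):

```python
import heapq
import math
from itertools import count

def huffman_lengths(probs):
    # Track codeword lengths only: whenever a subtree is merged,
    # every symbol inside it moves one level deeper.
    tie = count()
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        for i in a + b:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tie), a + b))
    return lengths

probs = [0.2, 0.18, 0.12, 0.1, 0.1, 0.08, 0.06, 0.06, 0.06, 0.04]
L = huffman_lengths(probs)
H = -sum(p * math.log2(p) for p in probs)                # entropy, ~3.15 bits
avg = sum(p * l for p, l in zip(probs, L))               # average code length
eff = H / avg                                            # efficiency
red = 1 - eff                                            # redundancy
var = sum(p * (l - avg) ** 2 for p, l in zip(probs, L))  # variance of lengths
print(L, H, avg, eff, red, var)
```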
9) State and prove the continuous channel capacity theorem.
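The result in question 9 is the Shannon-Hartley information capacity theorem: for a band-limited AWGN channel of bandwidth B (Hz), average signal power S, and noise power N,

```latex
C = B \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits/second}
```

Reliable communication is possible at any rate below C and impossible above it.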
10) Encode the following source using the Shannon-Fano and Huffman coding procedures and compare the results (a Shannon-Fano sketch follows the table).
X      X1     X2     X3     X4     X5
P(X)   0.3    0.1    0.4    0.08   0.12
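A minimal Shannon-Fano sketch for question 10 (the function name and tie-breaking are my own):

```python
def shannon_fano(symbols):
    # symbols: (symbol, probability) pairs sorted by falling probability.
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    acc, split, best = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):  # find the most balanced two-way split
        acc += symbols[i - 1][1]
        if abs(total - 2 * acc) < best:
            best, split = abs(total - 2 * acc), i
    code = {s: "0" + w for s, w in shannon_fano(symbols[:split]).items()}
    code.update({s: "1" + w for s, w in shannon_fano(symbols[split:]).items()})
    return code

src = sorted([("X1", 0.3), ("X2", 0.1), ("X3", 0.4), ("X4", 0.08), ("X5", 0.12)],
             key=lambda kv: -kv[1])
print(shannon_fano(src))  # lengths 1, 2, 3, 4, 4 -> average 2.08 bits/symbol
```

For this particular source, Huffman coding (see the sketch under question 5) happens to produce the same codeword lengths, so the two averages coincide at 2.08 bits/symbol; in general Huffman's average length is never worse, since it is optimal among prefix codes.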
11) (a) Using Shannon-Fano coding, encode the following source:
X      X1     X2     X3     X4     X5     X6     X7
P(X)   0.4    0.2    0.12   0.08   0.08   0.08   0.04
(b) Describe the Huffman coding algorithm in detail and compare it with other kinds of coding.
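For part (b), a useful anchor for the comparison is the Kraft inequality, which the codeword lengths l_k of every binary prefix code must satisfy:

```latex
\sum_{k} 2^{-l_k} \le 1
```

Huffman coding is provably optimal: it minimizes the average length \bar{L} = \sum_k p_k l_k over all prefix codes obeying this inequality, whereas Shannon-Fano coding is only near-optimal and can produce a strictly larger average length.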