Channel capacity formula for symmetric channel


1) Describe the importance of the conditional entropy H(X|Y) of a communication system, where X is the transmitted symbol and Y is the received symbol.
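As a hedged illustration for this question (the joint distributions below are invented for demonstration), H(X|Y) can be computed directly from a joint pmf: it is zero for a noiseless channel and grows as the channel gets noisier, which is exactly why it is called the equivocation of the channel.

```python
import math

def conditional_entropy(joint):
    """H(X|Y) in bits, from a joint pmf given as {(x, y): p}."""
    # Marginal of Y: p(y) = sum over x of p(x, y)
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    # H(X|Y) = -sum p(x,y) * log2( p(x,y) / p(y) )
    return -sum(p * math.log2(p / p_y[y])
                for (x, y), p in joint.items() if p > 0)

# Noiseless binary channel: Y always equals X, so nothing about X is lost.
noiseless = {(0, 0): 0.5, (1, 1): 0.5}
print(conditional_entropy(noiseless))  # 0.0

# Useless channel: Y independent of X, so H(X|Y) = H(X) = 1 bit.
noisy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(noisy))  # 1.0
```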

2) An event has six possible outcomes with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/32. Determine the entropy of the system.
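A quick numerical check for question 2 (a sketch, not part of the question set): since every probability is a power of 1/2, the entropy comes out to a clean value.

```python
import math

# Probabilities from question 2.
p = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
assert abs(sum(p) - 1.0) < 1e-12  # must be a valid distribution

# H = -sum p_i * log2(p_i), in bits per outcome
H = -sum(pi * math.log2(pi) for pi in p)
print(H)  # 1.9375 bits
```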

3) Explain the source coding theorem in detail, state the advantages and disadvantages of channel coding, and explain data compaction.

4) Describe the properties of entropy and, with an appropriate example, describe the entropy of a binary memoryless source.
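The standard example for question 4 is the binary entropy function, which exhibits the key properties directly: it is zero for a deterministic source and peaks at 1 bit when both symbols are equally likely. A minimal sketch:

```python
import math

def binary_entropy(p):
    """H_b(p) = -p*log2(p) - (1-p)*log2(1-p), in bits per symbol."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 -- maximum, both symbols equally likely
print(binary_entropy(0.0))  # 0.0 -- deterministic source
# H_b is symmetric about p = 0.5: H_b(p) = H_b(1 - p).
print(binary_entropy(0.1), binary_entropy(0.9))
```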

5) The five symbols of the alphabet of a discrete memoryless source and their probabilities are given below. S = [S0, S1, S2, S3, S4]; P[S] = [0.4, 0.2, 0.2, 0.1, 0.1]. Encode the symbols using Huffman coding.
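A way to sanity-check a hand-constructed Huffman code for question 5 (a sketch; the heap-based construction below is a standard textbook method, not one prescribed by the question — tie-breaking can change individual codeword lengths but not the average):

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths from a heap-based Huffman construction."""
    # Heap entries: (probability, tie-breaker, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)  # two least-probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1              # merging adds one bit to each member
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]        # S0..S4 from question 5
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
H = -sum(p * math.log2(p) for p in probs)
print(L)  # average length: 2.2 bits/symbol (up to float rounding)
print(H)  # source entropy: ~2.122 bits/symbol
```

The average length of 2.2 bits/symbol is the same for every valid Huffman tree on this source, even though the individual codeword lengths can differ.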

6) Write detailed notes on differential entropy. Deduce the channel capacity theorem and explain the implications of the information capacity theorem.
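For the differential-entropy part of question 6, the closed form for a Gaussian source, h(X) = ½ log2(2πeσ²), can be checked numerically (a sketch; the choice σ = 1 is arbitrary):

```python
import math

def gaussian_diff_entropy(sigma):
    """Differential entropy of N(0, sigma^2) in bits: 0.5*log2(2*pi*e*sigma^2)."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

print(gaussian_diff_entropy(1.0))  # ~2.047 bits
# Doubling sigma adds exactly one bit: h(2*sigma) = h(sigma) + 1.
print(gaussian_diff_entropy(2.0) - gaussian_diff_entropy(1.0))  # ~1.0
```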

7) What do you understand by a binary symmetric channel? Deduce the channel capacity formula for a symmetric channel.
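The BSC capacity formula that question 7 asks for, C = 1 − H_b(p), can be verified numerically at its boundary cases (a sketch; the crossover probabilities used are illustrative):

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H_b(p), in bits per channel use."""
    if p in (0.0, 1.0):
        return 1.0  # deterministic channel (a fixed inversion is invertible)
    hb = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - hb

print(bsc_capacity(0.0))   # 1.0 -- noiseless
print(bsc_capacity(0.5))   # 0.0 -- output independent of input
print(bsc_capacity(0.11))  # ~0.5 bits per channel use
```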

8) Construct a binary optimal code for the symbol probabilities given below using the Huffman procedure, and compute the entropy of the source, the average code length, the efficiency, the redundancy, and the variance: 0.2, 0.18, 0.12, 0.1, 0.1, 0.08, 0.06, 0.06, 0.06, 0.04.
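A cross-check for question 8 that avoids committing to one particular Huffman tree (variance depends on tie-breaking): the entropy is a hard lower bound on the average code length, and Shannon code lengths ceil(log2 1/p) give an easy, non-optimal upper bound, so the Huffman average must land in [H, H + 1). A sketch:

```python
import math

probs = [0.2, 0.18, 0.12, 0.1, 0.1, 0.08, 0.06, 0.06, 0.06, 0.04]
assert abs(sum(probs) - 1.0) < 1e-9  # valid distribution

# Source entropy: lower bound on any prefix code's average length.
H = -sum(p * math.log2(p) for p in probs)
print(H)  # ~3.149 bits/symbol

# Shannon code lengths ceil(log2(1/p)): a simple upper bound; the
# hand-built Huffman average length L must satisfy H <= L <= this.
shannon_L = sum(p * math.ceil(-math.log2(p)) for p in probs)
print(shannon_L)  # 3.84 bits/symbol (up to float rounding)
```

Once the Huffman average length L is known from the tree, efficiency is H/L and redundancy is 1 − H/L.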

9) State and prove the continuous channel capacity theorem.
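The theorem in question 9 culminates in the Shannon-Hartley formula C = B log2(1 + S/N); a quick numerical illustration (the bandwidth and SNR figures below are arbitrary examples, not taken from the question):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity of an AWGN channel, in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz channel at 30 dB SNR (linear SNR = 1000).
print(shannon_capacity(3000, 1000))  # ~29902 bits/s
# Capacity grows only logarithmically with SNR: doubling the SNR
# adds far less than double the capacity.
print(shannon_capacity(3000, 2000))  # ~32900 bits/s
```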

10) Encode the source given below using the Shannon-Fano and Huffman coding procedures. Compare the results.

X      X1     X2     X3     X4     X5
P(X)   0.3    0.1    0.4    0.08   0.12
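For the Huffman half of the comparison in question 10, a short script can check the hand result (a sketch; the heap-based construction is a standard textbook method, and tie-breaking may change individual codeword lengths but not the average):

```python
import heapq
import math

# Probabilities of X1..X5 from question 10.
probs = [0.3, 0.1, 0.4, 0.08, 0.12]

# Heap-based Huffman construction, tracking codeword lengths only.
heap = [(p, i, [i]) for i, p in enumerate(probs)]
heapq.heapify(heap)
lengths = [0] * len(probs)
tie = len(probs)
while len(heap) > 1:
    p1, _, a = heapq.heappop(heap)  # two least-probable subtrees
    p2, _, b = heapq.heappop(heap)
    for s in a + b:
        lengths[s] += 1             # merging adds one bit to each member
    heapq.heappush(heap, (p1 + p2, tie, a + b))
    tie += 1

L = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
print(lengths)  # one valid assignment; tie-breaks may permute lengths
print(L, H)     # average length 2.08 vs entropy ~2.041 bits/symbol
```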

11) a) Encode the source given below using Shannon-Fano coding:

X      X1     X2     X3     X4     X5     X6     X7
P(X)   0.4    0.2    0.12   0.08   0.08   0.08   0.04
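A Shannon-Fano sketch for question 11(a) (an assumption-laden illustration: the recursive balanced-split procedure below is one common formulation, exact fractions are used to keep tie-breaking deterministic, and a different tie-break at an equal split is equally valid and changes individual lengths):

```python
import math
from fractions import Fraction

def shannon_fano(probs):
    """Codeword lengths for a Shannon-Fano code.

    `probs` must be sorted in descending order. Ties in the balancing
    step are broken toward the earliest split point.
    """
    lengths = [0] * len(probs)

    def split(lo, hi):
        if hi - lo <= 1:
            return  # a single symbol needs no further bits
        total = sum(probs[lo:hi])
        best_diff, best_k, running = None, None, 0
        for k in range(lo + 1, hi):
            running += probs[k - 1]
            diff = abs(2 * running - total)  # |left sum - right sum|
            if best_diff is None or diff < best_diff:
                best_diff, best_k = diff, k
        for i in range(lo, hi):
            lengths[i] += 1  # this split contributes one bit to every member
        split(lo, best_k)
        split(best_k, hi)

    split(0, len(probs))
    return lengths

# X1..X7 from question 11(a), as exact fractions.
probs = [Fraction(p, 100) for p in (40, 20, 12, 8, 8, 8, 4)]
lengths = shannon_fano(probs)
L = sum(p * l for p, l in zip(probs, lengths))
H = -sum(float(p) * math.log2(p) for p in probs)
print(lengths)      # [1, 3, 3, 4, 4, 4, 4] with this tie-breaking
print(float(L), H)  # average length 2.48 vs entropy ~2.42 bits/symbol
```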

(b) Describe the Huffman coding algorithm with a suitable example and compare it with other types of coding in detail.
