Solve the following problem:
A discrete memoryless source has an alphabet of size 7, X = {x1, x2, x3, x4, x5, x6, x7}, with corresponding probabilities {0.02, 0.11, 0.07, 0.21, 0.15, 0.19, 0.25}.
1. Determine the entropy of this source.
2. Design a Huffman code for this source, and find the average codeword length of the Huffman code.
3. A new source Y = {y1, y2, y3} is obtained by grouping the outputs of the source X as
y1 = {x1, x2, x5}
y2 = {x3, x7}
y3 = {x4, x6}
Determine the entropy of Y.
4. Which source is more predictable, X or Y ? Why?
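The quantities asked for above can be checked numerically. The sketch below (an assumption about how one might verify the answers, not part of the problem) computes the Shannon entropy H = -Σ p·log2(p), the average Huffman codeword length via the standard heap-based merging construction (each merge of two subtrees adds one bit to every codeword beneath them, so summing the merged probabilities gives the average length), and the entropy of the grouped source Y, whose symbol probabilities are the sums of the grouped X-probabilities.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code.

    Each merge of the two least-probable nodes lengthens every
    codeword under them by one bit, so the average length equals
    the sum of all merged-node probabilities.
    """
    # (probability, unique tie-breaker) entries so the heap never
    # compares beyond the first tuple element on equal probabilities
    heap = [(p, i) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    avg_len = 0.0
    counter = len(probs)
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        avg_len += p1 + p2
        heapq.heappush(heap, (p1 + p2, counter))
        counter += 1
    return avg_len

px = [0.02, 0.11, 0.07, 0.21, 0.15, 0.19, 0.25]
print(f"H(X)        = {entropy(px):.4f} bits/symbol")
print(f"Huffman Lbar = {huffman_avg_length(px):.4f} bits/symbol")

# Grouped source: each P(y) is the sum of the X-probabilities it absorbs
py = [0.02 + 0.11 + 0.15,  # y1 = {x1, x2, x5}
      0.07 + 0.25,         # y2 = {x3, x7}
      0.21 + 0.19]         # y3 = {x4, x6}
print(f"H(Y)        = {entropy(py):.4f} bits/symbol")
```

Since grouping symbols can only discard information, H(Y) ≤ H(X) always holds, which is the key observation for part 4: the source with the lower entropy is the more predictable one.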