Consider the set of 7 symbols X1, X2, ..., X7 with probabilities 1/12, 1/12, 1/12, 1/8, 1/8, 1/4, and 1/4.
(i) What is the entropy of this symbol set?
(ii) If we were to use the same number of bits for all symbols in this set, how many would we need for each symbol?
In the parts below, assume we are using a compression method that takes advantage of the relative frequency of each symbol, such as Huffman coding (see the sketch following part (iv)).
(iii) Compute the maximum compression ratio expected for an average text containing these 7 symbols.
(iv) Compute the maximum percentage of storage that would be saved for an average text containing these 7 symbols.
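
The following is a minimal Python sketch of one way to check all four parts. It assumes base-2 logarithms throughout, and it interprets the compression ratio in (iii) and the savings in (iv) relative to the fixed-length code from (ii); the variable names and the heapq-based Huffman construction are illustrative and not part of the problem statement.

import heapq
import math
from fractions import Fraction

# Probabilities of X1..X7 from the problem statement.
probs = [Fraction(1, 12)] * 3 + [Fraction(1, 8)] * 2 + [Fraction(1, 4)] * 2

# (i) Entropy in bits/symbol: H = -sum(p * log2(p)), about 2.65 bits here.
H = -sum(float(p) * math.log2(float(p)) for p in probs)
print(f"(i)   entropy H = {H:.4f} bits/symbol")

# (ii) Fixed-length code for 7 symbols: ceil(log2(7)) = 3 bits/symbol.
fixed_bits = math.ceil(math.log2(len(probs)))
print(f"(ii)  fixed-length code = {fixed_bits} bits/symbol")

# Huffman code lengths: repeatedly merge the two least probable nodes;
# every symbol under a merged node gains one bit of code length.
lengths = [0] * len(probs)
heap = [(p, i, [i]) for i, p in enumerate(probs)]  # (prob, tie-breaker, symbols)
heapq.heapify(heap)
counter = len(probs)
while len(heap) > 1:
    p1, _, s1 = heapq.heappop(heap)
    p2, _, s2 = heapq.heappop(heap)
    for s in s1 + s2:
        lengths[s] += 1
    heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
    counter += 1

avg_len = sum(p * n for p, n in zip(probs, lengths))  # 8/3 bits/symbol here
print(f"      Huffman average length = {float(avg_len):.4f} bits/symbol")

# (iii) Compression ratio versus the fixed-length code (assumed definition:
# fixed-length bits per symbol / compressed bits per symbol).
print(f"(iii) ratio with Huffman code = {fixed_bits / float(avg_len):.4f}")
print(f"      ratio at the entropy limit = {fixed_bits / H:.4f}")

# (iv) Percentage of storage saved versus the fixed-length code.
print(f"(iv)  saved with Huffman code = {100 * (1 - float(avg_len) / fixed_bits):.1f} %")
print(f"      saved at the entropy limit = {100 * (1 - H / fixed_bits):.1f} %")

With these probabilities the Huffman code reaches an average length of 8/3, which is about 2.67 bits/symbol, slightly above the entropy of roughly 2.65 bits/symbol; since the entropy is a lower bound on the average length of any symbol-by-symbol code, the ratio and savings computed from the entropy are the maximum values such a code could approach.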