Design a ternary Huffman code, using 0, 1, and 2 as letters, for a source with output alphabet probabilities given by {0.05, 0.1, 0.15, 0.17, 0.18, 0.22, 0.13}. What is the resulting average codeword length? Compare the average codeword length with the entropy of the source. (In what base would you compute the logarithms in the expression for the entropy for a meaningful comparison?)
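One way to explore this exercise is to sketch the D-ary Huffman procedure directly: repeatedly merge the D least probable symbols (after padding with zero-probability dummies until (n − 1) mod (D − 1) = 0, which for n = 7 and D = 3 requires none), counting one ternary digit per merge. The comparison at the end uses base-3 logarithms, so the entropy comes out in ternary digits, the same units as the codeword length. This is a sketch, not the book's solution; the function names are my own.

```python
import heapq
import math

probs = [0.05, 0.1, 0.15, 0.17, 0.18, 0.22, 0.13]

def dary_huffman_lengths(p, D=3):
    """Return the codeword length of each symbol for a D-ary Huffman code."""
    n = len(p)
    # Heap entries: (probability, list of original symbol indices in this subtree)
    heap = [(pi, [i]) for i, pi in enumerate(p)]
    # Pad with zero-probability dummies so every merge combines exactly D nodes.
    while (len(heap) - 1) % (D - 1) != 0:
        heap.append((0.0, []))
    heapq.heapify(heap)
    lengths = [0] * n
    while len(heap) > 1:
        merged_p, merged_syms = 0.0, []
        for _ in range(D):
            pi, syms = heapq.heappop(heap)
            merged_p += pi
            merged_syms += syms
            for s in syms:
                lengths[s] += 1  # each merge pushes these symbols one level deeper
        heapq.heappush(heap, (merged_p, merged_syms))
    return lengths

lengths = dary_huffman_lengths(probs)
avg_len = sum(pi * li for pi, li in zip(probs, lengths))
# Base-3 entropy, so both quantities are in ternary symbols per source symbol.
entropy3 = -sum(pi * math.log(pi, 3) for pi in probs)

print(lengths)                    # one symbol gets length 1, the rest length 2
print(round(avg_len, 4))          # average length in ternary symbols
print(round(entropy3, 4))         # entropy in ternary digits; avg_len >= entropy3
```

With these probabilities the merge order gives the most probable symbol (0.22) a one-digit codeword and every other symbol a two-digit codeword, so the average length is 1.78 ternary symbols, slightly above the base-3 entropy of about 1.70, as the source-coding theorem requires.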