A source emits seven messages with probabilities 1/2, 1/4, 1/16, 1/32, 1/64, 1/128, and 1/128 respectively.
Find the entropy of the source.
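The entropy can be sketched numerically with the standard formula H = Σ p·log2(1/p). Note that the probabilities as stated sum to 7/8, not 1 (the source list may contain a typo); the sketch below uses the values exactly as given.

```python
from math import log2

# Probabilities as stated in the problem
# (they sum to 7/8, not 1 -- possibly a typo in the source).
probs = [1/2, 1/4, 1/16, 1/32, 1/64, 1/128, 1/128]

# Entropy: H = sum of p * log2(1/p), in bits per message
H = sum(p * log2(1 / p) for p in probs)
print(f"H = {H} bits/message")
```

All of the stated probabilities are exact binary fractions, so the floating-point result here is exact: H = 1.609375 bits/message for the values as given.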
Encode this source using ternary Huffman coding, that is, encode it using three symbols, say '0', '1', and '2'.
What is the average length of a codeword?
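A minimal sketch of ternary Huffman coding: repeatedly merge the three least probable nodes into one, prefixing a distinct code digit onto each merged subtree. With radix r = 3, the number of symbols n must satisfy (n - 1) mod (r - 1) = 0; here (7 - 1) mod 2 = 0, so no zero-probability dummy symbols are needed. The heap-based structure below is an illustrative implementation choice, not prescribed by the problem.

```python
import heapq
from fractions import Fraction

# Probabilities as stated in the problem (they sum to 7/8, not 1;
# we use them as given).
probs = [Fraction(1, d) for d in (2, 4, 16, 32, 64, 128, 128)]

# Each heap entry: (probability, tie-breaker, {symbol_index: code_so_far}).
heap = [(p, i, {i: ""}) for i, p in enumerate(probs)]
heapq.heapify(heap)
tick = len(probs)  # unique tie-breaker so dicts are never compared

# Since (7 - 1) mod 2 == 0, every round can merge exactly three nodes.
while len(heap) > 1:
    merged_p, codes = Fraction(0), {}
    for digit in ("0", "1", "2"):
        p, _, sub = heapq.heappop(heap)
        merged_p += p
        # Prepend this branch's digit to every codeword in the subtree.
        for sym, code in sub.items():
            codes[sym] = digit + code
    heapq.heappush(heap, (merged_p, tick, codes))
    tick += 1

codes = heap[0][2]
avg_len = float(sum(p * len(codes[i]) for i, p in enumerate(probs)))
print(f"codeword lengths: {[len(codes[i]) for i in range(len(probs))]}")
print(f"average length = {avg_len} ternary symbols")
```

Tracing the merges by hand gives codeword lengths 1, 1, 2, 2, 3, 3, 3 for the messages in the stated order, so the average length is 1/2·1 + 1/4·1 + 1/16·2 + 1/32·2 + 1/64·3 + 2·(1/128)·3 = 33/32 = 1.03125 ternary symbols per message.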