Apply the Huffman algorithm to the plaintext source S that generates the symbols a, b, c, d, e, f, g, and h independently with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, and 1/128, respectively. What is the expected number of bits needed to encode one letter? Compare this with the entropy of the source.
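
As a sanity check for this exercise, here is a minimal Python sketch (not part of the problem statement) that builds the Huffman tree with a binary heap, tracks each symbol's code length, and compares the expected code length to the source entropy:

```python
import heapq
from math import log2

# Source symbols and their probabilities from the exercise.
probs = {"a": 1/2, "b": 1/4, "c": 1/8, "d": 1/16,
         "e": 1/32, "f": 1/64, "g": 1/128, "h": 1/128}

def huffman_code_lengths(probs):
    """Return the Huffman codeword length of each symbol."""
    # Heap entries: (probability, tie-breaker, symbols in this subtree).
    # The integer tie-breaker keeps tuple comparison away from the sets.
    heap = [(p, i, {s}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in probs}
    counter = len(heap)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every codeword inside them.
        for s in s1 | s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 | s2))
        counter += 1
    return lengths

lengths = huffman_code_lengths(probs)
expected = sum(probs[s] * lengths[s] for s in probs)
entropy = -sum(p * log2(p) for p in probs.values())
print(f"expected code length = {expected} bits/symbol")
print(f"entropy              = {entropy} bits/symbol")
```

Since every probability here is a power of 1/2, the Huffman code lengths come out to -log2(p) for each symbol, so the expected code length and the entropy should coincide exactly.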