Response to the following problem:
The output of a DMS (discrete memoryless source) consists of the possible letters x1, x2, ..., xn, which occur with probabilities p1, p2, ..., pn, respectively. Prove that the entropy H(X) of the source is at most log n, and find the probability mass function for which H(X) = log n.
Make sure you use enough details to support your answer.
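The claimed bound can be checked numerically: the sketch below (an illustration, not part of the proof; the function name `entropy` and the example distributions are mine) computes H(X) = -Σ p_i log p_i for a uniform and a non-uniform distribution over n = 4 letters, showing the uniform case attains log n while the skewed case falls strictly below it.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum p_i log(p_i); terms with p_i = 0 contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

n = 4
uniform = [1.0 / n] * n          # equiprobable letters: p_i = 1/n
skewed = [0.7, 0.1, 0.1, 0.1]    # any non-uniform distribution on n letters

# Uniform distribution attains the bound: H(X) = log2(4) = 2 bits.
print(entropy(uniform))  # 2.0
# A non-uniform distribution gives strictly smaller entropy.
print(entropy(skewed))
```

Note that the base of the logarithm only rescales the entropy; the inequality H(X) ≤ log n holds in any base as long as log n uses the same base.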