Question: An analog-to-digital converter (ADC) takes V(t) as its input and outputs a binary word B(t) of fixed length k bits, which is its best approximation to V(t). Suppose V(t) can vary continuously between 0 and 5 volts. The ADC is required to measure V(t) with an accuracy equivalent to two decimal places; in other words, the ADC's measurement error, i.e. the difference between V(t1) and B(t1) at any time t1, should never exceed five thousandths of a volt (5 millivolts). What is the smallest value of k needed to achieve this level of accuracy? Briefly but clearly explain your reasoning.
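For reference, here is one way the calculation could be sketched. It assumes the ADC rounds V(t) to the nearest quantization level, so the worst-case error is half the step size; under that assumption the step must be at most twice the allowed error, and k is the smallest bit count giving enough levels to cover the 5 V range.

```python
import math

V_RANGE = 5.0      # full-scale input range in volts (assumption: 0 to 5 V)
MAX_ERROR = 0.005  # allowed measurement error: 5 mV

# Assuming rounding to the nearest level, the worst-case error is
# half the step size, so the step may be at most 2 * MAX_ERROR = 10 mV.
max_step = 2 * MAX_ERROR

# Number of distinct levels needed to span the range at that step size.
levels_needed = V_RANGE / max_step  # 500 levels

# Smallest k with 2**k >= levels_needed.
k = math.ceil(math.log2(levels_needed))
print(k)  # prints 9, since 2**9 = 512 >= 500 but 2**8 = 256 < 500
```

Note this depends on the rounding assumption: if the ADC instead truncated down to the level below, the worst-case error would equal the full step size, the step would have to be at most 5 mV, and k would come out one bit larger.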
Please provide the answer as soon as possible; there is no word limit.