The voltage output from a 0–50 psi pressure transducer varies linearly from 0 to 5 mV. The signal will be recorded with a 12-bit A/D converter with a 0–10 V input range, so the transducer output must be amplified before it is converted to a digital signal.
i) What gain is required to make the digital resolution Q of the signal recorded by the computer correspond to a change of one psi at the transducer?
ii) What gain is required for the same situation if a 14-bit A/D converter is used?
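As a sanity check, the gain follows from equating the amplified output change for a 1 psi step (gain G times the 0.1 mV/psi transducer sensitivity) to one ADC count, Q = (input range)/2^N. A minimal Python sketch of that arithmetic (the constant and function names are illustrative, not part of the original problem):

```python
# Transducer sensitivity: 5 mV over 50 psi = 0.1 mV/psi
SENSITIVITY_V_PER_PSI = 5e-3 / 50.0
# ADC full-scale input range, volts
FULL_SCALE_V = 10.0

def required_gain(bits: int) -> float:
    """Gain G such that a 1 psi change, amplified, equals one ADC count Q."""
    q = FULL_SCALE_V / 2**bits          # digital resolution Q, volts per count
    return q / SENSITIVITY_V_PER_PSI    # G * (0.1 mV/psi) * (1 psi) = Q

print(f"12-bit: G = {required_gain(12):.2f}")  # ~24.41
print(f"14-bit: G = {required_gain(14):.2f}")  # ~6.10
```

The 14-bit converter needs a factor of 2^2 = 4 less gain than the 12-bit one, since its count size Q is four times smaller over the same 0–10 V range.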