The American Standard Code for Information Interchange (ASCII) has 128 binary-coded characters. If a certain computer generates 100,000 characters/second, determine the following:
a. The number of bits (binary digits) required per character.
b. The number of bits/second required to transmit the computer output, and the minimum bandwidth required to transmit the signal.
c. For single-error detection capability, an additional bit (parity bit) is added to the code of each character. Modify your answers in parts (a) and (b) in view of this information.
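The arithmetic behind all three parts can be sketched as follows. This is a hedged illustration, not part of the original problem: it assumes the bits-per-character count is log2 of the symbol count, and that the minimum bandwidth is taken as the Nyquist minimum of half the bit rate (a common textbook convention; other texts may use a different bandwidth criterion).

```python
import math

def ascii_link_parameters(num_symbols=128, chars_per_sec=100_000, parity=False):
    """Return (bits/character, bit rate in bits/s, minimum bandwidth in Hz).

    Assumptions (not stated in the problem itself):
    - bits per character = ceil(log2(num_symbols)), plus 1 if a parity bit is added
    - minimum bandwidth = bit rate / 2 (Nyquist minimum for binary signaling)
    """
    bits_per_char = math.ceil(math.log2(num_symbols)) + (1 if parity else 0)
    bit_rate = bits_per_char * chars_per_sec   # bits/second
    min_bandwidth = bit_rate / 2               # Hz
    return bits_per_char, bit_rate, min_bandwidth

# Parts (a) and (b): 128 symbols -> 7 bits/character
print(ascii_link_parameters())
# Part (c): with the added parity bit -> 8 bits/character
print(ascii_link_parameters(parity=True))
```

Under these assumptions, parts (a) and (b) give 7 bits/character, 700,000 bits/s, and a 350 kHz minimum bandwidth; part (c) raises these to 8 bits/character, 800,000 bits/s, and 400 kHz.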