Suppose the counting rate is 35 counts per minute. How long must you count to reach a percent uncertainty of 5%? Now suppose there is a room background of 21 counts per minute. If you count the signal and the background for 10 minutes each, what is the percent error of the net counting rate?
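The arithmetic behind both parts can be sketched as follows, assuming Poisson counting statistics (so the uncertainty on N counts is √N) and assuming the 35 counts/min is the gross signal-plus-background rate; the specific variable names are illustrative only.

```python
import math

# Part 1: fractional uncertainty of a Poisson count is 1/sqrt(N),
# so reaching 5% requires N = (1/0.05)^2 = 400 counts.
gross_rate = 35.0                 # counts per minute (assumed gross rate)
target = 0.05                     # desired fractional uncertainty
n_required = (1.0 / target) ** 2  # counts needed
t_required = n_required / gross_rate  # minutes of counting
print(f"counts needed: {n_required:.0f}, time: {t_required:.1f} min")

# Part 2: count gross and background for 10 minutes each and subtract.
# Errors add in quadrature: sigma_net = sqrt(N_gross + N_bkg) / t.
bkg_rate = 21.0                   # counts per minute
t = 10.0                          # minutes for each measurement
n_gross = gross_rate * t          # expected gross counts
n_bkg = bkg_rate * t              # expected background counts
net_rate = (n_gross - n_bkg) / t                # net counts per minute
sigma_net = math.sqrt(n_gross + n_bkg) / t      # propagated uncertainty
percent_error = 100.0 * sigma_net / net_rate
print(f"net rate: {net_rate:.1f} +/- {sigma_net:.2f} counts/min "
      f"({percent_error:.1f}%)")
```

Note that the net-rate uncertainty (about 17%) is much worse than 5% even though each raw count alone would be quite precise: subtracting a comparable background inflates the relative error on the small difference.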