The thermometer of the previous problem (a first-order system with a time constant of 10 s, initially at 30°C and suddenly exposed to a surrounding temperature of 120°C, for which the 90 percent rise time and the time to reach 99 percent of the steady-state temperature were calculated) is now subjected to a harmonic temperature variation with an amplitude of 20°C and a frequency of 0.01 Hz. Determine the phase lag of the thermometer and the amplitude attenuation. The time constant is again taken as 10 s.
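A minimal sketch of the standard first-order frequency-response relations applied to these values: the amplitude ratio is 1/√(1 + (ωτ)²) and the phase lag is arctan(ωτ), with ω = 2πf. The variable names are illustrative, not from the original.

```python
import math

tau = 10.0                       # time constant, s
f = 0.01                         # forcing frequency, Hz
A_in = 20.0                      # input amplitude, deg C
omega = 2 * math.pi * f          # angular frequency, rad/s

# First-order system frequency response
amplitude_ratio = 1 / math.sqrt(1 + (omega * tau) ** 2)
phase_lag_rad = math.atan(omega * tau)
phase_lag_deg = math.degrees(phase_lag_rad)
time_delay = phase_lag_rad / omega          # phase lag expressed as time
A_indicated = A_in * amplitude_ratio        # indicated amplitude, deg C

print(f"amplitude ratio    = {amplitude_ratio:.4f}")
print(f"attenuation        = {(1 - amplitude_ratio) * 100:.1f} %")
print(f"indicated amplitude = {A_indicated:.2f} C")
print(f"phase lag          = {phase_lag_deg:.2f} deg  ({time_delay:.2f} s)")
```

With ωτ = 2π(0.01)(10) ≈ 0.628, this gives an amplitude ratio of about 0.847 (roughly 15 percent attenuation, so an indicated amplitude near 16.9°C) and a phase lag of about 32.1°, equivalent to a delay of roughly 8.9 s.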