An operator tells you that it takes 2 minutes for a process to get halfway to its final value after a step change in the input. What can you infer about the process time constant (τP)?
Assume dead-time (θP) is negligible and the process is a linear first-order system.
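A quick numerical sketch of the inference under the stated assumptions (negligible dead-time, linear first-order dynamics): the step response is y(t) = 1 − exp(−t/τP), so reaching half the final value at t = 2 min gives τP = 2/ln 2. The variable names here are illustrative, not from the original problem.

```python
import math

# First-order step response: y(t) = 1 - exp(-t / tau).
# "Halfway to the final value" means y(t_half) = 0.5, so
# 0.5 = exp(-t_half / tau)  =>  tau = t_half / ln(2).
t_half = 2.0                    # minutes (given by the operator)
tau_p = t_half / math.log(2)    # inferred process time constant
print(round(tau_p, 2))          # ~2.89 minutes
```

So the time constant is about 2.89 minutes, i.e. roughly 1.44 times the observed half-response time.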