A test set has a 99% probability of correctly detecting a system fault and a 2% probability of declaring a false alarm (i.e., declaring a fault when none has actually occurred). If the actual probability that the system has a fault is 1%, what is the probability that the test set will accurately detect it? (Hint: Let F be the event that a fault occurs, and D be the event that the test set declares a fault.)
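
The numbers translate directly into conditional probabilities: P(D | F) = 0.99, P(D | F') = 0.02, and P(F) = 0.01. Below is a minimal sketch of how these quantities combine; the variable names are mine, and which computed value answers the question (the joint probability P(F and D), or the posterior P(F | D) via Bayes' theorem) depends on how the final sentence is read, so both are shown.

```python
# Sketch of the setup, assuming F = "a fault occurs" and D = "the test set declares a fault".
p_f = 0.01              # P(F): a fault actually occurs
p_d_given_f = 0.99      # P(D | F): a real fault is correctly detected
p_d_given_not_f = 0.02  # P(D | F'): false-alarm probability

# Probability that a fault occurs AND the test set detects it
p_f_and_d = p_d_given_f * p_f

# Total probability that the test set declares a fault (law of total probability)
p_d = p_d_given_f * p_f + p_d_given_not_f * (1 - p_f)

# Posterior probability that a declared fault is real (Bayes' theorem)
p_f_given_d = p_f_and_d / p_d

print(f"P(F and D) = {p_f_and_d:.4f}")
print(f"P(D)       = {p_d:.4f}")
print(f"P(F | D)   = {p_f_given_d:.4f}")
```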