Below is the definition of the useless function:
1: function useless(n)
2: if n = 1 then
3: return 1
4: else
5: return useless(random(1, n))
6: end if
7: end function
where random(1, n) returns a uniformly distributed random integer in the range 1..n.
Assume an initial call to the function useless(m).
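For reference, here is a minimal runnable Python sketch of the pseudocode above (only an illustration of the definition, not part of the problem statement; random.randint plays the role of random):

import random

def useless(n):
    # Direct translation of the pseudocode.
    if n == 1:
        return 1                              # base case (pseudocode line 3)
    return useless(random.randint(1, n))      # pseudocode line 5; randint includes both endpoints

# Whenever the call returns, the value is 1; only the number of random calls varies.
print(useless(50))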
Point: The function either returns 1 or recurses forever.
Q1. How can this be proved formally?
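A possible sketch (assuming the initial call is useless(m) with m >= 2): since random(1, n) only returns values in 1..n, the argument never increases during the recursion, so each individual call to random returns 1 with probability at least 1/m. In LaTeX notation:

\Pr[\text{more than } k \text{ calls to random are made}] \;\le\; \left(1 - \frac{1}{m}\right)^{k} \;\xrightarrow[k \to \infty]{}\; 0

Hence the recursion reaches n = 1 and returns 1 with probability 1; an infinite run is possible in principle but has probability 0.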
Point: The call to the random function on line 5 occurs whenever the current argument is not equal to 1, and the recursion keeps calling random until the argument becomes 1.
Q2. So how exactly can we calculate the expected number of calls to the random function?
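One way to set it up (a sketch; E(n) is my notation for the expected number of calls to random made by useless(n)): condition on the value returned by the first call to random on line 5, which is uniform over 1..n. In LaTeX notation:

E(1) = 0, \qquad E(n) = 1 + \frac{1}{n}\sum_{k=1}^{n} E(k) \quad (n \ge 2)

Moving the E(n) term of the sum to the left-hand side gives

E(n) = \frac{n}{n-1} + \frac{1}{n-1}\sum_{k=1}^{n-1} E(k),

which by induction on n works out to E(n) = 1 + \sum_{k=1}^{n-1} \frac{1}{k} = 1 + H_{n-1}, i.e. a logarithmic expected number of random calls starting from useless(m).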
Point: In the best case, useless(1), the random function is called 0 times.
Q3. But what about the worst case, useless(n) with n ≠ 1: how many times will the random function be called?
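To get a feel for this empirically, here is a small Monte Carlo sketch (the names count_random_calls and TRIALS and the choice m = 1000 are mine, purely for illustration). The average number of calls stays small while the largest observed count varies from run to run, consistent with the fact that any finite number of calls can be exceeded with positive probability, so there is no fixed worst-case bound, only a probabilistic one:

import random

def count_random_calls(n):
    # Iterative equivalent of useless(n) that counts the calls to random.
    calls = 0
    while n != 1:                   # pseudocode line 2: keep going until n = 1
        n = random.randint(1, n)    # pseudocode line 5
        calls += 1
    return calls

TRIALS = 100_000
m = 1000
samples = [count_random_calls(m) for _ in range(TRIALS)]
print("average number of random calls:", sum(samples) / TRIALS)
print("largest number of random calls seen:", max(samples))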