Problem
Create an object called `stock_price` that is initially equal to 100. Imagine that each day the price of the stock fluctuates randomly, with the daily change drawn from a normal distribution with a mean of 0 and a standard deviation of 1. Use a loop to find how many days it takes before the price of the stock is greater than 150 or less than 50.
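The problem names no particular language, so here is a minimal sketch in Python using the standard library's `random.gauss` for the daily fluctuation. The names `stock_price` and `days`, and the seed used for reproducibility, are choices for illustration, not part of the problem statement.

```python
import random

random.seed(42)  # optional: fix the seed so a run is reproducible

stock_price = 100.0
days = 0

# Loop until the price leaves the [50, 150] band,
# adding one Normal(0, 1) step per simulated day.
while 50 <= stock_price <= 150:
    stock_price += random.gauss(0, 1)
    days += 1

print(f"Exited after {days} days at price {stock_price:.2f}")
```

Because each daily step has standard deviation 1 and the price must drift 50 units from its start, a single run typically takes on the order of a few thousand days; rerunning without a fixed seed gives a different count each time.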