Given two lines in the 2D Cartesian coordinate system, represented by

    Y = a1 * X + b1
    Y = a2 * X + b2

where a1 and a2 are the slopes and b1 and b2 are the Y-intercepts. Assume the two lines are never parallel (a1 != a2), so they always intersect at exactly one point. Write a function that computes the intersection point of the two given lines, and verify your function in a main program by calling it with sample inputs.