Suppose an object is at position x = 0 m at time t = 0 s. At t = 0.5 s the object is at x = 1.5 m, and at t = 1.0 s it is at x = 2.0 m. What is the average velocity over the first interval (attributed to the interval's midpoint, t = 0.25 s) and over the second interval (midpoint t = 0.75 s)? What is the average acceleration of the object?
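The calculation can be sketched numerically: each average velocity is Δx/Δt over an interval and is attributed to that interval's midpoint, and the average acceleration is the change in those velocities divided by the time between midpoints. The list names below are illustrative, not part of the problem.

```python
# Times (s) and positions (m) from the problem statement.
times = [0.0, 0.5, 1.0]
positions = [0.0, 1.5, 2.0]

# Average velocity over each interval, attributed to the interval midpoint.
midpoints = []
avg_velocities = []
for i in range(len(times) - 1):
    dt = times[i + 1] - times[i]
    dx = positions[i + 1] - positions[i]
    midpoints.append((times[i] + times[i + 1]) / 2)
    avg_velocities.append(dx / dt)

print(midpoints)       # [0.25, 0.75]
print(avg_velocities)  # [3.0, 1.0]  (m/s)

# Average acceleration: change in average velocity between the midpoints.
accel = (avg_velocities[1] - avg_velocities[0]) / (midpoints[1] - midpoints[0])
print(accel)           # -4.0  (m/s^2)
```

So the object averages 3 m/s over the first half-second, 1 m/s over the second, and decelerates at an average of 4 m/s².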