Suppose I mis-measure the apparent flux from a galaxy by 10%, and that I'm using the Tully-Fisher relationship to derive the galaxy's distance. Assuming that the Tully-Fisher relationship and my measurement of the rotation speed of the galaxy are both perfect, what is the corresponding percentage error I would make in the distance to the galaxy?
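
A minimal sketch of the error-propagation argument this question calls for (the symbols $F$, $L$, and $d$ are introduced here for illustration; the sketch assumes the inverse-square law, and that the Tully-Fisher relation fixes the luminosity $L$ exactly from the perfectly measured rotation speed):
\[
F = \frac{L}{4\pi d^{2}}
\quad\Longrightarrow\quad
d = \sqrt{\frac{L}{4\pi F}} \;\propto\; F^{-1/2},
\]
so logarithmic differentiation gives
\[
\frac{\delta d}{d} = -\frac{1}{2}\,\frac{\delta F}{F} = -\frac{1}{2}(0.10) \approx 5\% \ \text{in magnitude}.
\]
That is, a 10% flux error propagates to roughly a 5% distance error, with opposite sign: an overestimated flux implies an underestimated distance.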