The sun is a sphere with radius 6.96 × 10^8 m and an average surface temperature of 5800 K. Determine the amount by which the sun's thermal radiation increases the entropy of the entire universe each second. Assume the sun is a perfect emitter (that is, emissivity e = 1) and that the average temperature of the rest of the universe is 2.73 K. Do not consider the thermal radiation absorbed by the sun from the rest of the universe.
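One way to approach this: the Stefan-Boltzmann law gives the radiated power P = σ A T⁴ with A = 4πR², and that power carries entropy out of the sun at rate P/T_sun while depositing it into the surroundings at rate P/T_universe; the net increase is the difference. The sketch below works through the arithmetic under those assumptions (σ ≈ 5.67 × 10⁻⁸ W m⁻² K⁻⁴):

```python
import math

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

R_sun = 6.96e8    # radius of the sun, m
T_sun = 5800.0    # surface temperature of the sun, K
T_univ = 2.73     # temperature of the rest of the universe, K

A = 4 * math.pi * R_sun**2   # surface area of the sun, m^2
P = SIGMA * A * T_sun**4     # radiated power for a perfect emitter (e = 1), W

# Entropy leaves the sun at rate P/T_sun and enters the colder
# surroundings at rate P/T_univ; the universe's net gain per second
# is the difference.
dS_dt = P / T_univ - P / T_sun

print(f"Radiated power P = {P:.3e} W")
print(f"dS/dt = {dS_dt:.3e} J/(K*s)")
```

With these numbers the power comes out near 3.9 × 10^26 W, and the net entropy production is on the order of 1.4 × 10^26 J/K each second; because T_univ ≪ T_sun, the P/T_univ term dominates.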