Two archers shoot arrows in the same direction from the same place with the same initial speeds, but at different angles. One shoots at 41.0 degrees above the horizontal, while the other shoots at 56.0 degrees. If the arrow launched at 41.0 degrees lands 220 m from the archer, how far apart are the two arrows when they land? (You can assume that the arrows start at essentially ground level.)
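
A minimal sketch of one way to work this out, assuming the standard level-ground range formula R = v^2 sin(2*theta) / g. Because both arrows leave with the same speed v, both v^2 and g cancel when the two ranges are compared, so only the ratio of the sines matters:

```python
import math

# Level-ground range: R = v^2 * sin(2*theta) / g.
# Both arrows share the same launch speed, so comparing ranges gives
#   R2 = R1 * sin(2*theta2) / sin(2*theta1)
# The numbers below are taken directly from the problem statement.

theta1 = math.radians(41.0)   # launch angle of the first arrow
theta2 = math.radians(56.0)   # launch angle of the second arrow
R1 = 220.0                    # range of the first arrow, in metres

R2 = R1 * math.sin(2 * theta2) / math.sin(2 * theta1)
separation = abs(R1 - R2)

print(f"Range of second arrow: {R2:.1f} m")       # about 206.0 m
print(f"Separation on landing: {separation:.1f} m")  # about 14.0 m
```

Under these assumptions the second arrow lands roughly 14 m short of the first, since sin(112 deg) is slightly smaller than sin(82 deg).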