Did the US become a global imperialist and military power slowly and reluctantly (i.e., it was pushed into it by circumstance), or did it intentionally set out to build an empire during the period 1890-1920?
Details for the argument are appreciated! There need to be at least TWO arguments for whichever side you choose and ONE against it to show impartiality, but any additional arguments will also be happily accepted.