Depends on what you call a gain. IMO, the USA did not gain much from the war. America had to assume vast military burdens for other countries. This, along with the huge amounts spent during WWII, has left the American government deep in debt.

The war also altered the American perspective on the world, IMO in a negative way. Before the war, the USA engaged actively in trade and economics but avoided foreign entanglements via treaty. After the war, the USA entered into a multitude of agreements that have caused it to engage in fighting in remote and non-strategic arenas around the globe. The USA was founded as a Republic that was meant to stand apart from entangling treaties. Instead, the USA now operates much more like a nation defending an Empire than a Republic. WWII was the catalyst for this change.

Wiki User

19y ago