World War 2

Did the Americans win World War 2?

Wiki User
September 12, 2011 9:27PM

The Allies won WW2, and the US was one of the Allies.

I'll add to it....

Out of all the major players in the war, the U.S. gained the most from the outcome. First, under the Bretton Woods system the world's financial system was pegged to the U.S. dollar from then on, and the U.S. was at the time the world's creditor. Secondly, the British Empire was dismantled, which opened overseas markets to American trade and investment. Thirdly, the U.S. held a monopoly on nuclear weapons.

Tactically, the U.S. is generally credited with winning the war against Japan, albeit with minor help from the Australians.

Simply put, the U.S. and, to an extent, the U.S.S.R. emerged as the world's two superpowers after World War Two ended.

Just to add, out of respect: let's give a shout-out to the British and the thousands who died for them too. They held off the Germans the longest, and without them the outcome would probably have been a Nazi victory.

Overall, the Nazi defeat was an American, Russian and British victory. Only together could WE win.

God Bless America.

God Save The Queen.

Long live Russia.
