It allowed America to demonstrate its military strength and advanced weaponry.
The USSR and the United States of America.
America was seen as a defender of liberty.
World War 2 brought about a lot of migration to and from war-torn areas. The end of the war brought a baby boom to America as soldiers returned home and citizens celebrated the end of the war.
America entered the war after the attack on Pearl Harbor.
Yes, they were allied during World War 2. One of the most significant examples of this was when the two countries' forces met at the Elbe River and happily embraced each other.
It doesn't.
World War 2.
He helped win the war.
A few major changes occurred because of World War 2. A major cultural shift was the spotlight shed on women in the workforce, symbolized by Rosie the Riveter. Additionally, after World War 2 the US economy was booming, since wartime weapons production had brought in a lot of revenue.
America
The US dollar became an international currency after World War 2.
None; America didn't fight until World War 2.