
What effect did the war in Europe have on the US?

Wiki User
December 15, 2011 3:27PM

Some people don't realize that through the 1930s and right up until Pearl Harbor, the vast majority of Americans wanted the US to stay out of World War II. Europe was an ocean away, and the general view of isolationists here was that Europeans had bickered for thousands of years and that America shouldn't get entangled in European problems it couldn't solve. When Japan attacked Pearl Harbor, however, FDR had a reason to commit more than just material support to the war effort.

While some could argue the New Deal took the edge off the Great Depression, I would argue that the war truly ended it. Furthermore, we as a country began to understand that we were part of a global community and could no longer hide from the rest of the world. The United States emerged from World War II as a superpower, but it was immediately thrust into other conflicts, most notably the Cold War.