No. The war actually helped bring the United States out of the Great Depression: increased war production and the mobilization of the entire country made American industrial output skyrocket.

Wiki User · 14y ago