It was the Great Depression.
It was brought to an end by World War II, though some people think the war only made the Depression worse.
After World War II.
It didn't.
World War II, then the Korean War, Vietnam, and the Gulf Wars.
World War II got the U.S. out of the Great Depression, and the economy never fell back to that point afterward.
World War II
No, it was after World War I.
Because so many supplies were needed for the war, businesses finally began making money again.
The Great Depression ended when the war began. The war demanded industrialization and military power, so many men and women were given jobs (men as soldiers, women in factories).
The Depression.
Maybe the war; at the time, people were unemployed at a rate of 25%.
It ended it by providing millions of jobs to the unemployed, who then had money to spend, stimulating the economy.
World War II is actually what completely brought the United States out of the Great Depression. At the beginning of the war, similar to what we did in WWI, we sold weapons, ammunition, and other war materials to the Allies in Europe. This greatly helped our economy and brought us out of the Depression.
I am not being facetious: The biggest effect of World War I was World War II.
No, World War II ended the Depression. When the United States joined the war, many jobs were created.