The direct effect of World War I on America was a significant economic boost, as the U.S. became a leading supplier of war materials to the Allies, leading to increased industrial production and job creation. This economic growth contributed to the nation's emergence as a global power. Additionally, the war led to social changes, including the Great Migration, where many African Americans moved north for better opportunities, and the women's suffrage movement gained momentum, culminating in the 19th Amendment in 1920.
America did fight in World War I; the United States entered the war on the Allied side in April 1917.
One negative effect of World War II on America was that all of the wartime spending left the economy in a poor state. Another negative effect was the number of Americans who lost their lives in this horrible war.
America had loaded weapons onto ships to supply the Allies in WWI, and German submarines sank them. This outraged Americans, and the United States joined the war. Beyond that, there wasn't much of an effect.
World War 2
It helped win the war
Which of the following was not a direct consequence of America's victory in World War II?
The American government had no direct effect on the Holocaust. Obviously, the defeat of Nazi Germany by the Allies ended the genocide. I wonder if you are mistakenly equating the Holocaust with World War 2.
It affected 69 people. lol
World War 2
During the war, millions of previously unemployed Americans returned to work making weapons and other war materials.
I am not being facetious: The biggest effect of World War I was World War II.
It killed or wounded about 1.2 million Americans, affected the American economy, led America to build alliances, and made Americans feel proud, giving the country a good reputation.
America entered the war after Pearl Harbor.
It had no effect.