World War I had a significant impact on the United States socially, economically, and politically. The war stimulated industrial growth and increased job opportunities, fueling the Great Migration as African Americans moved north for work. Politically, the U.S. emerged as a global power, influencing international affairs and contributing to the establishment of the League of Nations, although it ultimately did not join. Socially, the war prompted changes in women's roles and accelerated movements for civil rights.
