The United States helped bring World War I to an end by entering the conflict in April 1917, providing fresh troops and resources that bolstered the exhausted Allied forces. American military and economic support helped tip the balance against the Central Powers, contributing to their eventual defeat. The U.S. also played a crucial role in the post-war negotiations, influencing the Treaty of Versailles and promoting the idea of the League of Nations, which aimed to prevent future conflicts. This involvement marked a shift in U.S. foreign policy toward greater engagement in international affairs.