
What did the US do about European relations after World War 1?

When the United States first entered World War I, President Woodrow Wilson called it "the war to end all wars." Since the Civil War, America had been largely isolationist, refusing to engage in overseas wars (except for the brief Spanish-American War) or to join what Thomas Jefferson had called "entangling alliances." After the atrocities of WWI, the majority of Americans regretted ever joining the war and returned to their isolationist policy. This is why the Senate refused to ratify the Treaty of Versailles or to join the League of Nations, despite Wilson's urging; it is even arguable that this is why the United States entered WWII so late. After World War I, the United States stayed as far as possible from European affairs.