American attitudes toward World War 1 at its outbreak were mostly indifferent. At the time, most Americans had no interest in projecting American power onto other countries and favored isolationism. However, several incidents shifted public opinion until eventually the US declared war on Germany with broad public support.

Wiki User · 10y ago
