Mexico
Militarism, Alliances, Imperialism, and Nationalism (often remembered by the acronym "MAIN").
It was actually Japan that brought the U.S. into World War 2. After Japan bombed Pearl Harbor on December 7, 1941, the United States declared war on Japan.
No, the U.S. entered World War 1 in 1917.