It was a peace treaty that ended World War I.
The report strengthened isolationist sentiment in the United States.
Isolationism arose after World War I because more than one hundred thousand American men died in a war that had little to do with the United States. Many Americans wanted to stay out of any war that did not directly involve the USA, to avoid needless death and expense.
The U.S. remained an isolationist country because many Americans felt it should not get involved in other countries' affairs; entanglement in exactly that kind of foreign conflict was what had drawn it into World War I.
The Central Powers were overrunning many countries, and the US joined the war in 1917 because it was no longer willing to remain isolationist.
Maybe, but almost certainly not for at least two to three years. The US populace was strongly isolationist from the end of World War I until we were attacked and realized the oceans could no longer protect us.
isolationist
Yes, it did.
The US stopped pursuing an isolationist foreign policy after it was dragged into World War II and found itself a major power.
Because they were neutral.
They enjoyed it.
Yes. The Neutrality Acts reflected popular US support for isolationism.
At the beginning of World War I, the US adopted an isolationist policy, which basically meant that it refused to get involved. While remaining isolationist, the US tried to persuade the countries involved to declare peace.
Yes, in World War I and World War II the French were allied with the US.