What happened in the US after WW1?

Updated: 8/19/2023

Best Answer · Wiki User · 15y ago

WWI had its most devastating effect on Germany, whose economy was crushed into oblivion; this led to WWII. The US lost many soldiers, but its casualties and fatalities were minuscule compared to those of the other countries involved. Germany lost the most people. The blunt answer is that WWI led to WWII, so you might want to research the Treaty of Versailles.

Two websites I absolutely must recommend are www.worldwar1.com and www.firstworldwar.com. There you will find more information than you could ever use, including primary documents and art, songs, and poems. They also have timelines, etc., and have been very helpful for a WWI project I'm in the middle of.

The state of the world was damaged politically, socially, and economically.

It was damaged economically by industrial output being low and international trade being interrupted (after all, the war alone cost 45,000 million).

All political systems changed dramatically: the Russian monarchy was overthrown by communism, and Germany's monarchy was overthrown and a democracy set up.

There were millions of refugees, and many areas lay in ruins. Socially, the damage included ten million men dead, millions wounded and/or disabled, five million widows, nine million orphans, and the victors going into mourning. The shortage of men meant an unstable birth rate. Finally, many soldiers were psychologically damaged by shell shock, gas and bomb attacks, the loss of limbs, and the deaths of comrades and family.

Militarism, for reference, is basically the building up of armies and preparing for war.

Wiki User · 10y ago
More answers
Wiki User · 8y ago

The aftermath included a decrease in the population of the nations involved in the war and huge war debts: in the US the stock market eventually fell, and Germany had to pay off most of the war debt because it lost the war. Advancements in science and technology also slowed. Lastly, there was the loss of a complete generation of the men and women involved in the war.

Wiki User · 15y ago

The economy skyrocketed for a while; then, in the 1930s, we hit the Depression.

Wiki User · 9y ago

Industry and the economy grew in the US as an effect of World War I. With a large number of able-bodied men absent, more employment opportunities opened for women. The government also adopted new diplomatic policies, and an anti-war sentiment developed that affected the country's entrance into World War II.

Wiki User · 12y ago

It's hard to see what long-term effects WW1 had on America.

America had a policy of isolationism during WW1.

America, whilst officially neutral in WW1, armed ships supplying Britain with arms and supplies while preventing the shipment of supplies to Germany. It also supported Britain's blockade of German ports and the use of mines in international waters.

The Germans tried to encourage Mexico to declare war on the US with the Zimmermann Telegram, offering it the chance to reclaim American states that used to be part of Mexico (namely Texas, New Mexico, and Arizona).

This, along with the sinking of the British liner Lusitania with 128 Americans on board, meant that America declared war on Germany in 1917.

American participation involved the deployment of US battleships to Scapa Flow (the Royal Navy base in Scotland) and the deployment of several units of US Marines to France as part of the American Expeditionary Force.

After the end of WW1, America once again turned to isolationism, which meant the US remained officially neutral during the initial stages of World War 2.

America also helped significantly in the rebuilding of Germany; this would become an embarrassment with the rise of the Nazis.

Japan's attack on Pearl Harbor was prompted by a belief that the US wouldn't fight but would instead capitulate.
