What happened to Germany after World War 1?
A variety of things, including:
Borders were redrawn, and Germany lost territory to certain countries (like Alsace-Lorraine to France).
The Reich (Empire) was dissolved and the Kaiser abdicated, ending the monarchy.
In the Treaty of Versailles, Germany was, in essence, blamed for the war and was forced to pay reparations to other countries for their expenses.
Strict restrictions were placed on German military buildup.
The new government (the Weimar Republic) was built on a shaky foundation. Between the harsh terms of the treaty and the war's broader economic impact, inflation was rampant and many people were unemployed.
Adolf Hitler and the National Socialists (Nazis) exploited the poor economy, the treaty, and other nationalist and anti-Semitic sentiments to take control in the 1930s.
Technically, Germany was not defeated; it gave in. Not surrendered, gave in. After the "defeat of Germany", the leaders of the United States, Britain, France, and Italy met to settle the war for good. They wrote the Treaty of Versailles, which blamed Germany for the war, made it pay war reparations, and forced it to dismantle its military.
WW1: 1914; WW2: 1939. World War 1 began on July 28, 1914. It was sparked in Sarajevo, then part of Austria-Hungary, when Archduke Franz Ferdinand was assassinated by a group of terrorists apparently from Serbia. Austria-Hungary had a military alliance with Germany, as Serbia did with Russia. When Germany declared war on Russia, France declared war on Germany, so when Germany went through Belgium to get into France, it didn't realise that Britain had guaranteed Belgium's neutrality. At that point it…