What happened to Germany after World War 1?

A variety of things, including:

Borders were redrawn, and Germany lost territory to other countries (for example, Alsace-Lorraine to France).

The Reich (Empire) was dissolved and the Kaiser abdicated.

In the Treaty of Versailles, Germany was, in essence, blamed for the war and was forced to pay reparations to other countries for their expenses.

Restrictions were placed on the size and buildup of the German military.

The new government (the Weimar Republic) was built on a shaky foundation. Under the harsh terms of the treaty and the general economic toll of the war, inflation was rampant and many people were unemployed.

Adolf Hitler and the National Socialists (Nazis) exploited the poor economy, resentment of the treaty, and nationalistic and anti-Semitic sentiment to take control in the 1930s.

Answer

After WW1 Germany was in a state of economic depression, its economy exhausted by the cost of the war. It could not afford the reparations demanded by the Allies, yet was forced into signing the Treaty of Versailles, which outraged the German public. Roughly a decade later, Adolf Hitler rose to prominence as a politician.