What happened after World War I ended?

After World War I ended, the victorious Allies and Germany signed the Treaty of Versailles in 1919. The treaty set out the terms imposed on Germany, which bore the brunt of the punishment: it was assigned responsibility for the war and required to pay enormous reparations, which it only finished paying off in 2010.