What changed the world after the First World War?

After the First World War, the Treaty of Versailles drove many global changes. For one thing, the provinces of Alsace and Lorraine, which Germany had annexed in 1871, were returned to France, and Germany lost the vast majority of its military force: its army was capped at 100,000 men and it was forbidden an air force.