After World War I, the Treaty of Versailles in 1919 led to significant territorial losses for Germany. The Allies, particularly France, Belgium, Poland, and Czechoslovakia, took land from Germany, including Alsace-Lorraine, parts of Prussia, and territories that became the Saar Basin and the Free City of Danzig (Gdańsk). Additionally, the treaty imposed restrictions on Germany's military and economic capacity, further shaping the post-war landscape in Europe.
No. France gave in to Germany, and Germany then took over France; we did not help them. But that was not World War I, the war the question asks about. Britain joined France in its fight against Germany during the First World War.
France should, because they killed half of everyone's (Italy's, Germany's, etc.) armies and took everyone's land.
The New York Times
about 14 percent
It was in the Treaty of Versailles that Germany took FULL RESPONSIBILITY for World War I. Hope this helped you, Michaela Thompson
World War 1 took place in Europe, the Middle East, and Germany. World War 2 took place in Russia, Germany, France, and Belgium.
The Treaty of Versailles was what ended World War I. The document placed all of the blame on Germany, even though Austria-Hungary started the war. The Treaty stripped away most of Germany's army, took a good chunk of Germany's land, and triggered a severe economic depression in Germany. The Treaty of Versailles was what ultimately led to World War II.
As the name "World War" might imply, it took place across the world - one land, principally in Europe and Africa, and on many of the world's oceans.
The United States of America, Great Britain, USSR, and France took control of Germany after World War 2.
Canada, the US, Britain, and Russia beat Germany, Italy, and Austria-Hungary. They took money and land away from Germany, which created a depression.
It really was not necessary. They took out their anger about the war on Germany. This ultimately caused the start of the Second World War.
No, Germany did not take over Japan. After World War 2, two atomic bombs were dropped on Japan, but Germany never took over Japanese land; there was no one to rule it, because Adolf Hitler was dead.
World War One took place in France, Belgium, Germany, and Russia.
I think it was Germany...
Because the Communists took control of Russia from the Tsar. The Communists were against the war with Germany and hence agreed to end it after they took the reins of power in Russia.
France did not lose any land to Germany in World War 1, but Germany had to hand over land (Alsace-Lorraine) to France under the treaty of June 1919, which Germany was not happy about, as the two countries had always been enemies. Well, that's just World War 1.