Well, until they started invading again in WWII.
Mainly to state to Germany that they had lost and to cover up some of the mistakes the allies made.
It was drawn up and signed in the Palace of Versailles, France.
Treaty of Versailles, I'm pretty sure that's what it's called!
Germany clandestinely built up its military capability.
Basically, the Treaty of Versailles stated that Germany had to take full responsibility for the outbreak of World War I. Under the Treaty, Germany was forced to pay $33 billion in reparations to the countries affected by the war. Germany was also required to give up most of the colonies it had controlled prior to WWI. So, I suppose these events would make Germany very sad...
Germany, because it put the blame on them. Germany also lost a lot of land and wasn't allowed to build up its military due to the treaty.
It was the peace treaty which marked the end of WW1. Germany had to accept total responsibility for causing the war and pay reparations to certain countries. The treaty also prevented Germany from rearming.
Without the Treaty of Versailles, Hitler may never have come to power. The Nazi party thrived when things looked bad, because they offered ways out (even though most of the time they couldn't fulfill them); when times were good, people stayed clear of the Nazis' radical ideas. When the Treaty of Versailles took effect, Germany lost its pride and slid into a great depression. Hitler seized this moment: he blamed all of Germany's problems on the Jewish community. Although the Jews had nothing to do with this, the people of Germany needed someone to blame and focus their hate on, and Hitler gave them this. Most people at this time didn't follow Nazi beliefs at all, but using negative cohesion Hitler still managed to gain public support. He hated the treaty and told the people he would rip it up; he told Germany he would rebuild their army and regain their national pride. All these ideas appealed to the public, so the people supported Hitler at this time. That is how Hitler used the treaty to gain support.
WW1 ended with the Treaty of Versailles, which blamed the war on Germany. Germany was so angry about this that they looked to Hitler, their new dictator, for guidance. This led up to WW2, in a way.
One of the key purposes of the Treaty of Versailles was to weaken Germany and make France the leading power in Europe. It was not about popularity.
The "Great War" is the First World War. Thus the answer is NO. The Treaty of Paris, signed in Paris, France by representatives of King George III of Great Britain and representatives of the United States of America on September 3, 1783, ended the American Revolutionary War. The Treaty of Versailles was one of the peace treaties at the end of the First World War. It ended the state of war between Germany and the Allied Powers. It was signed on 28 June 1919 at Versailles, France.