Best Answer

Technically, Germany was not defeated outright; it agreed to an armistice rather than an unconditional surrender. After the fighting ended, the leaders of the United States, Britain, France, and Italy met to settle the war for good. They wrote the Treaty of Versailles, which blamed Germany for the war, required Germany to pay war reparations, and forced Germany to dismantle most of its military.

Wiki User

12y ago
Q: What happened after the defeat of Germany in World War 1?
Related questions

What Happened Before the Cold War?

The Cold War began after the Second World War ended with the defeat of Germany and Japan.


What year did Germany defeat France in World War 1?

It didn't. Germany lost World War I.


Was it harder to defeat Germany in World War 1 or World War 2?

Most historians would say World War 2: it lasted longer, and defeating Germany required the total occupation of the country and an unconditional surrender, whereas World War 1 ended with an armistice.


Did Germany defeat Norway in World War 2?

Yes.


How did Germany emerge from defeat at the end of World War 1?

Germany emerged from defeat economically devastated and politically unstable. The monarchy was replaced by the Weimar Republic, and resentment over the Treaty of Versailles helped fuel the rise of extremist movements.


Could Germany defeat us in World War 2?

No


Did this happen in World War 1 or World War 2: the war ended with Germany's defeat in 1945?

World War 2


In which war did the US along with its allies defeat Hitler's Nazi Germany?

World War 2


What was Germany's aim during World War I?

To defeat Britain, France, and Russia.


What did the US have to do when World War 1 began?

Defend its allies and defeat Germany.


Can Germany defeat US in war?

This is purely speculative; there is no factual answer. In reality, Germany and the United States are allies today.


When did the US defeat Japan in World War 2?

Japan surrendered in August 1945, after the atomic bombings of Hiroshima and Nagasaki and the Soviet declaration of war, several months after Germany's surrender in May 1945.