Q: What was the fate of Germany and its colonies at the end of World War 1?
Related questions

Which World War 2 conference decided the fate of Germany?

The Potsdam Conference.


After World War 1, who lost their colonies in Africa?

Germany


What was the fate of Germany's naval ships after World War 2?

Sunk and scrapped (recycled).


What happened to Germany's African colonies after World War 1?

Germany was stripped of all of its overseas colonies under the Treaty of Versailles.


When did Germany lose its colonies in Africa and Asia?

After World War 1 (1919).


What was a direct result of World War 1?

Germany lost its colonies in Africa and Asia.


What happened to Germany's colonies after World War 1?

After WW1, Germany's colonies declined and fell apart. Many people lost their jobs because of the war, and women had to take over work from men who had gone off to fight. Japan gained some of Germany's colonies.


Did World War I lose colonies?

World War 1 never had colonies, nor has any other war before or since. It did, however, cost the nations who fought in it numerous colonies, most notably Germany, which lost large possessions in Africa and Asia.


What was part of France's goal after the war?

Taking Germany's colonies.


Did Germany want a general war?

Germany did want a general war. It wanted to become a dominant world power by replacing Britain, but it needed more colonies to do so.


How did the loss of colonies in Africa lead to Germany starting World War 2?

It didn't. The loss of its colonies was not a reason that Germany started the war.


What happened to the colonies once ruled by Germany after World War 1?

The peace settlement took them away from Germany. Most were given to Britain or France to govern.