What happened to Germany's colonies after World War 1?

Best Answer

After WW1, Germany's colonies went downhill and fell apart: jobs were lost because of the war, and most women had to take over jobs from the men who were away fighting.

Japan gained some of Germany's colonies

Related questions

How much was Germany's war debt after World War 1?

37 billion dollars


What was part of France's goal after the war?

taking Germany's colonies


What happened to the Spanish colonies after the Spanish-American War?

They became US Territories.


Who were Germany's allies in World War 1?

Germany was in an alliance with Austria-Hungary and Italy.


Why did the world go into World War 1?

Austria-Hungary declared war on Serbia and then, with Germany's backing, attacked France and England ... in a nutshell.


Why did the map of Europe change significantly from World War 1 to World War 2?

Because Germany and the communist countries lost power.


What are the major events leading to world war 1?

The assassination of Austria-Hungary's Archduke Franz Ferdinand began everything.


Who led Germany's rise to power just before World War 2?

Adolf Hitler did.


What happened to the states after World War 2?

They were destroyed by the war.


What happened in World War 2?

Nothing.


What happened in 1992 in World War 2?

The Second World War had finished by 1945.


Germany's invasion of this country triggered World War 2?

Poland, on 1 September 1939