Taking Germany's colonies.
Germany had colonies in Africa and the Pacific.
No, Germany has no colonies or territories.
Germany
All of Germany's overseas colonies were stripped from it under the Treaty of Versailles.
Taking Germany's colonies and imposing heavy reparations on Germany.
Mainly England, Holland, and Germany.
Germany lost her African colonies after WWI, so there were no German colonies left to occupy in WWII.
No; the League of Nations took control of Germany's overseas colonies.