Yes, Germany did have colonies.
Germany
Germany had colonies in Africa and the Pacific.
Germany lost its colonies in Africa and Asia.
Germany lost her African colonies after World War I, so there were no colonies to occupy in World War II.
After World War One (1919).
Spain, Italy, France, Denmark
Under the Treaty of Versailles, Germany's former colonies in Africa and Asia were redistributed as League of Nations mandates, chiefly to Britain and France.
Germany was a colonizer during the late 19th and early 20th centuries, establishing colonies in Africa and the Pacific.
Great Britain, France, Germany, Portugal, Italy, Belgium