Was Germany colonized?

No, Germany was not colonized; in the modern era it was itself a colonial power, holding territories in Africa and the Pacific from 1884 until 1919. The history here is also mixed up with Prussia's. The Teutonic Knights founded a Baltic state in the 13th century that later became the Duchy and then the Kingdom of Prussia; Prussia was one German state among many, and it led the unification of Germany in 1871 as the German Empire. So Prussia became part of Germany rather than being renamed Germany, and the country has been called Germany since 1871, not merely since the end of WWI.