No. Germany ceased to be an empire after WWI (the German Empire ended with the Kaiser's abdication in 1918), so Hitler never controlled it.

Wiki User

13y ago
