Germany lost World War I and fell into a deep economic depression. Germans were understandably angry, and when Hitler came along promising them revenge and prosperity, they were desperate enough to believe him. He later invaded neighboring countries, and his invasion of Poland in 1939 is what started World War II.

Wiki User

13y ago