For a short answer: he gave them a better economy through war. After the Treaty of Versailles left Germany in terrible shape, Hitler came along and offered the people a way out. War brought business and many jobs. It wasn't until later that people saw what he was really doing.

Wiki User

13y ago
