WWI ended with the Treaty of Versailles, which held Germany responsible for the war. Germany lost territory to Poland and Czechoslovakia, and the treaty's reparations left it in deep economic trouble. Hitler promised to make Germany great again, and at first he expanded its borders without fighting: remilitarizing the Rhineland, annexing Austria, and taking the Sudetenland. That ended with Poland, which he invaded with his military in September 1939. Britain and France then declared war on Germany, and other countries were soon drawn into the conflict.

