World War I led to World War II because, after the war, the countries involved signed the Treaty of Versailles, under which Germany had to give up territory, demilitarize, accept full blame for the war, and pay reparations to the Allied countries. As a result, Germany wanted revenge on the Allied countries.


Wiki User

15y ago