
According to practically all historians, the first cause was the 1919 Treaty of Versailles, signed at the end of WW1. US President Wilson had cautioned strongly against making the peace treaty an exercise in punishing Germany, but Britain and France nevertheless made it so: they demanded enormous reparations payments, gave parts of Germany to newly established countries, and annexed other German territory along with much of its raw materials.

The result was economic and social turmoil, in which the Nazi Party and Adolf Hitler were able to rise to prominence. Hitler's earliest successes consisted of rolling back the earlier losses of German soil, and it was this that encouraged him to continue the aggressive policies that finally caused WW2. The second and most direct cause was Hitler's invasion of Poland in 1939. France and Britain had earlier guaranteed Poland's independence, and were now forced to make good on that guarantee by declaring war on Germany.


Wiki User, 9y ago
