
A very, very complicated question that would have to begin in the 1930s with the rise of the Nazi Party in Germany. In fact, the Treaty of Versailles, which ended WWI, has been called a cause of WWII; you would have to read a lot of history books. The Allies stormed the beaches to take back the land (France) that had been taken by Germany, and then moved through France and into Germany, while the Soviets attacked Germany from the east. Germany lost the day it attacked the USSR.


Wiki User

13y ago
