Germany is generally considered to have started WWII by invading the countries around it. Hitler believed Germany needed to make itself a powerful nation again, a status it had lost when the Treaty of Versailles was signed at the end of WWI. He built up his army and military equipment, annexed parts of Europe that he claimed belonged to Germany, and then attacked other countries when he saw that no one in Europe was going to oppose him. Some argue that Japan started the war with its earlier invasion of China and its border clashes with Russia, but others counter that if those had been the only conflicts, they would never have grown into a world war. Russia had defeated the Japanese in those border battles and would likely have come to the aid of the Chinese. Any further discussion becomes very complex and detailed.


Wiki User

17y ago
