World War II did not bring an end to political tensions around the world; Western and Soviet policies were seen to be at odds almost immediately. Nevertheless, the world-conquering ambitions of two powerful aggressor nations, Germany and Japan, had been decisively defeated, so it can fairly be said that after World War II the world was indeed better off.

Wiki User

11y ago
