
It does somewhat. Some of the same animosities that contributed to WWI also led to WWII, and the military-industrial complex built up during WWII made the USA the world powerhouse it is today.

Yes, because World War I helped cause WWII, and WWII brought the US out of the Depression. And if WWII hadn't happened, with Hitler and everything, I wouldn't have been born.
