It is interesting that WWI changed the rest of the world far more than it changed America; the USA was changed much more by WWII. By the end of WWI, the monarchies that had ruled Europe, and thus much of the world, for centuries had largely disappeared, and the seeds were sown for the dissolution of the great imperial empires of Britain and France, although it would take another half century to complete that process. The USA, however, changed very little.

WWII was different for the USA. By the end of that war, Americans had become far more involved in world affairs, and they remain so to this day. The mass involvement of women and Black Americans in the military and industrial effort laid the groundwork for the civil rights movements of the 1960s and 1970s. Thus the war put the USA into a definite position of political and military world leadership for the first time in its history.

Wiki User · 19y ago