It is interesting that WWI changed the rest of the world much more than it changed America; the USA was changed far more by WWII. By the end of WWI, the monarchies that had ruled Europe, and thus most of the world, for many centuries had largely disappeared, and the seeds were sown for the dissolution of the great imperial empires of Britain and France, although it would take another half century to complete the process. The USA, however, changed very little. WWII was different for the USA. By the end of that war, Americans had become much more involved in world affairs, and they continue to be to this day. The mass involvement of women and black Americans in the military and industrial effort laid the groundwork for the civil rights movements of the 60s and 70s. Thus the war put the USA in a definite position of world political and military leadership for the first time in its history.
What I want to know is: how did World War 2 affect funding for education, and what did the boys in Japan do during the war?
A PFC in the US Army paid $6.50 as a monthly premium for $10,000 worth of life insurance in World War 2.
It led to the Cold War.
This was America's (and the world's) first major war of the atomic age. It taught the US (and the world) how to fight a limited war, limited to conventional weapons with no nuclear weapons used.
World War One affected the US economically; the country took in a lot of money as a result of the war.
The United States emerged from the war as superpower.
The war affected the country greatly.
Yes.
World War I affected the US by elevating it to a leading position among the world powers. In other words, the war boosted the country toward becoming the world's superpower.
It just changed everything.
First, they fought for the world.
The US Civil War and World War I.