World War I brought changes that Germany and Russia were unprepared for. It thrust the United States into the role of world leader and marked the beginning of England's decline: the world's financial center shifted from London to New York, and England became heavily indebted to the United States. Women gained the right to vote in many European countries. The war stimulated the U.S. economy and increased employment as women entered the workplace. France suffered enormous property damage, while Germany faced severe economic difficulties that brought despair, hardship, and an uncertain future.
