The fallout in Germany allowed the Nazis to come to power, which led to World War II and still more upheaval, and that is just one consequence. A better question might be: how did World War I not impact Western cultures?


Continue Learning about History of Western Civilization

How did Americans feel about Western culture as a result of World War I?

After World War I, many Americans experienced a complex relationship with Western culture, characterized by both disillusionment and a desire for progress. The horrors of the war led to a questioning of traditional values and a rise in modernism, influencing art, literature, and social norms. Additionally, the war fostered a sense of nationalism that contributed to an embrace of American cultural identity, while also prompting a fascination with European avant-garde movements. Overall, the post-war period saw both a rejection of pre-war norms and an exploration of new cultural expressions.


How did Cold War tensions affect the relationship between East and West Berlin?

Cold War tensions turned Berlin into a divided, heavily contested city. The Soviet blockade of 1948-49 cut off West Berlin and was answered by the Berlin Airlift, and in 1961 East Germany built the Berlin Wall, physically separating the two halves and sharply restricting movement between them until 1989. Families were split apart, crossing was limited to a few controlled checkpoints, and the divided city became an enduring symbol of the wider East-West confrontation.


What new development in western society and culture came about after world war 1?

After World War I, Western society and culture experienced significant changes, notably the emergence of modernism. This movement embraced new artistic expressions and challenged traditional norms, reflecting the disillusionment and trauma of the war. The 1920s, often called the "Roaring Twenties," saw shifts in social attitudes, including increased freedoms for women, the rise of jazz music, and the flourishing of avant-garde literature and art. These developments contributed to a sense of cultural experimentation and a break from the past.


Where did the Western Front in World War I start and end?

The Western Front ran from the North Sea coast of Belgium, through northeastern France, to the Swiss border.


What provided financial aid to war-torn Western Europe after World War II?

The Marshall Plan, formally the European Recovery Program, launched in 1948.