World War I influenced culture and society by displacing millions of people and by pushing the United States into a position of global dominance. The war also forced a reexamination of women's rights, because women took on so many of the jobs traditionally held by men.
Many things can change a culture. War is one; another is migration, when people from different cultures move into an area and blend or start new traditions. And if a culture is on the losing side of a war, it may be forced to change as well.
France tried, through its culture, to form a friendly relationship with Germany. At the time of the war, France did not have warm feelings toward Britain, and it worked hard to make Germany see how important France was to the world.
Wars have changed cultures throughout history, largely because of the many casualties suffered over their course. War affects not only those directly involved but also their friends and families.
One major new aspect of American culture that emerged after the Revolutionary War was nationalism. Before the war, the colonies were fragmented in their allegiances; after the victory, most Americans were proud to be part of the new nation.