Q: How and why did the role of women change after WW1?

Best Answer

Wiki User (15y ago)
More answers

Wiki User (11y ago)

Women sometimes helped with the war effort and, as a result, became more respected.

Anonymous (3y ago)

During the First World War, while many of the men joined the army to fight, the women left behind at home worked in factories, making machinery and weapons for the soldiers to use overseas. With no one left to boss them around, they gained a sense of liberty that they campaigned to keep after the war.

Related questions

How did the war change women's roles in society during WW2?

With the men away fighting, the women took over many of the jobs traditionally done by men.


Did the colonists have jobs?

Women's roles.


How have women's roles changed when men have been away from home, such as during war or other long separations from the family?

Women's roles have changed because they cheat more now.


What were women's roles in World War 1?

Women's roles were to heal the wounded and to keep them strong.


What are women's current roles in the military?

What was the women's role in the military?


What are women's roles in today's society?

Making sandwiches in the kitchen.


What were the roles of the countries in World War 2?

The roles of countries did not change during the war; they were the same as before and afterwards.


What were traditional women's roles and jobs after World War 2?

Being a prostitute.


What were women's roles in 1900?

To work around the house and take care of the children.


Why were women's roles played by men in plays?

Women were prevented by law from acting.


Do the roles of white men and women in the South differ in the modern US?

Not really; most of them are the same, except that women's roles are totally different.


Do women's roles in society have to do with politics?

Yes, because politics has granted women their rights.