They hated women working because men liked to think they were higher up than women.
A white feather.
Yes. They were not being paid as much as the men were, and most employers were reluctant to employ women over men anyway. They were also put at risk when working and not given appropriate safety clothing.
£2.15
They were expected to give up their jobs.
While the men were away fighting the war, women took up the jobs that were originally held by men. When the men came back, they saw that women were capable of doing these jobs.
Women's roles in the war included serving as nurses, while African American men fought in segregated units, separate from the other soldiers.
They continued on with their lives, or they went to work.
One way was that they took over men's jobs while the men were gone.
During WW1, the U.S. was in a state of total war, reforming every part of daily life to help the war effort. Most men were either fighting in the war or working for companies producing goods for the war. There were more jobs to do than men to work them, so companies started to employ women. Since then, women have been working about as much as men have. Hope this helps!
With most men fighting at the front, it was left to the women to take over traditionally male jobs: working in factories, driving buses, trains, and trams, and so on. With the war over and the surviving men returning, many women resented having to go back to the so-called "Woman's Role" in the home. Many women (and men) began working for the emancipation of women, which is still not fully achieved even in 2017, though it is better than it was in the 1900s.
Women have been working hard since the beginning of time.
After World War One, women started demanding equal rights with men. Women were given the right to vote, to work in similar places as men, to divorce their husbands, to receive an education, and to own their own possessions.