With respect, really. Women essentially won the war for us. They produced all of the ammunition and weaponry we used to win the war. Overall, the men had the fighting spirit, and the women had the productive spirit.
No jobs and poor treatment.
They and their babies were killed.
How did World War I change the way of life for women in the United States?
No, women and men were not treated equally, but for the first time it came close. People realized that women could do "men's work" and began giving them a small amount of respect for it. However, at the end of the war, women were expected to go back to being good little housewives and let their husbands resume their jobs.
In World War I, women were used to clean wounds and take care of the injured.
Yes, there were women in World War I and World War II. The women had to work on farms and grow food for the men.
Go out and work.
Since many men were away fighting the war, women stepped in to do agricultural work.
Because with all of the men gone at war, there was no one left to work, so the women stepped in to substitute for the men.
Women were nurses in World War I. There weren't any women soldiers in World War I, and few women leaders. They did the men's work: some of them went to work in factories to produce weapons, some of them made clothes, and some of them went to war as paramedics or ambulance drivers.
Yeah, women were treated like dirt during World War II. Men thought we were useless. But we can fight! (Not that I want to.) Good thing we have respect now.
The men were fighting the war, so women went to work.