Q: What happened with women after World War 2 here in the US?

Best Answer
One result of the end of WWII was its negative effect on the female workforce. Male soldiers returning to the U.S. after the war sought work in the factories and on the farms where women had been employed during the war. Employers, perhaps operating under the sexist notion that men were superior to women, fired their female employees and hired men in their place. Thus working women, who had enthusiastically stepped into the breach to take on essential jobs while the men went off to fight, were stripped of their relatively high-paying positions. Their options were to become (or go back to being) homemakers, or to take low-paying jobs traditionally associated with women, such as teaching, nursing, housekeeping, and clerical work.

But American women remembered the pride and satisfaction that the "forbidden" jobs had brought them. Two to three decades later, during the feminist movement, women rose to challenge antiquated assumptions about a woman's place in the workforce and in society. Today there are women in the highest ranks of industry, finance, politics, and other fields that were formerly off limits, thanks in part to the working spirit awakened in them during WWII, and to the crushing blow that came afterward.

Wiki User

11y ago