Common jobs for women in the US, as in most Western nations, included domestic duties such as keeping house and minding the children. That is a rather stereotypical view, though, since women's rights were slowly but surely being asserted. Others, I suppose, would have included professions seen as feminine at the time, such as working in stores and other retail jobs. Shortly before the 60s era, during the war years when the man of the house was off fighting, women also took on the dreary work of factories and industrial plants. The wages were low, but they kept women and their families from starving.
Women worked jobs that had been held almost exclusively by men.
It provided jobs for unemployed men and women
The bikini became popular in France in the 40s, but hit the US big-time in the 60s, becoming the staple beachwear it is today.
It created new jobs in the service industry, but led to a decrease in manufacturing jobs
$1. It is struck in brass, is common, and is only worth face value.
About 300,000 US women joined the branches of the US armed forces, but NOT to fight in combat. They did clerical work, nursing, flying planes, and other vital jobs in the military at the time.
Women got better-paying jobs that aren't as dangerous.
It provided jobs for women, who had usually worked on the farm or been stay-at-home mothers.
Because it's part of our jobs to annoy the crap outta men!
breast cancer surgery
Well-paying jobs
turkey :)
Geography, types of weather, types of jobs, etc.
Women in their 20s.