Best Answer

OK, so basically it did and it didn't. During World War One, women were for the first time allowed to work outside the home without being judged for it. It became very common for women to work in factories and agriculture because the men were away fighting in the war. The war changed the minds of many strong opponents of women's suffrage, and without it women would not have got the vote in 1918, so during the first war their status did improve slightly. However, after the war their role was pushed back into the home, and only 11% of the women who had previously been in work stayed in work.

World War Two was much better for women: around eight million women were employed in factories and agriculture. They put their lives on the line in munitions factories, and there is no way the war could have been won without them. During the war there were also groups such as the Women's Territorial Army and the Women's Royal Air Force; the latter allowed women to fly planes abroad to occupied France to deliver them to the soldiers. Women were also used as spies for the first time. This was a huge improvement, and women embraced it. Even former Prime Minister Asquith, previously one of the biggest opponents of female suffrage, said that 'we couldn't have won the war without them' and that 'they have proved themselves in society and are truly equal and capable'.

However, once again after the war the government pushed women back into the home, using slogans such as 'kill germs, not Germans' and advertising that showed women looking glamorous while doing housework. One thing that did last was the change in style: it was now acceptable for women to be out in public dating, to wear trousers and short skirts, to have short hair, and to smoke. Women had a new lease of life.
However, the biggest changes came in the 1960s; the film 'Made in Dagenham' is very informative about this. New fashion came in, and it was a whole new era for women's confidence and for feminism. The Equal Pay Act was also introduced ('Made in Dagenham' is about the strike that led to it).


Wiki User

12y ago
More answers

Wiki User

10y ago

Both. You are asking about a long period of time-- more than six decades-- and during that time, some things got worse and others got better. For example, when the war ended, many of the women who had excellent and interesting jobs were forced to give them up so that the returning men could have them. Granted, some women wanted to stop working and begin a family now that their husbands had come home, but other women liked their jobs and wanted to keep working. In all too many cases, the women were given no choice; they had to quit whether they wanted to or not. During the fifties, the country became very conservative, and women were expected to stay at home and be traditional housewives and moms. Girls in high school were often discouraged from going to college, and there was pressure to marry young and have kids. Again, there is nothing wrong with marrying and having kids, but those of us who wanted to wait, or who wanted a career, were made to feel as if we were not normal. It should also be noted that America was still segregated after the war, so black women were still stuck with limited options, even more limited than those of white women.

In the sixties, things changed again, as the Civil Rights Movement, Women's Rights Movement, and Gay Rights Movement called attention to discrimination, and gradually more options presented themselves. Being different was not such a terrible thing anymore, and a number of young women decided to pursue careers or wait a while before getting married. There was still prejudice, of course, but there were also a lot more opportunities to carve out your own path. One thing that helped was the greater availability of birth control-- many states had restricted who could get birth control, but the last of those laws was overturned in the mid-1960s, so women had a say in whether to get pregnant or not. Other laws changed too, making it illegal to refuse to hire a woman just because she was female. By the early 1970s, you found more women in non-traditional fields like broadcasting, law, medicine, and business.

I could continue with each decade, but you would see recurring themes: some eras were filled with many new opportunities-- we saw more women entering politics, for example, and the current US Senate has more female senators than at any time in history; but other eras seemed to revert to an earlier time when women were only supposed to be homemakers (or perhaps secretaries, nurses or teachers). The point is that life is rarely all black or all white, all good or all bad. The pendulum seems to shift periodically. After World War II, sometimes it has seemed women would be free to forge their own path; but at other times, it has seemed society (and traditionalist men) wanted women to go back to the way things used to be.

Q: Did women's lives change for better or for worse after World War 2?