How did women gain the right to work?

Updated: 8/19/2019

Best Answer (Wiki User, 12y ago):
Women never gained a formal "right to work"; in the US, it was not until shortly after World War II that more than 50% of women were in the workforce.
