
The end of the First World War gave women a greater voice and more rights in the workplace: they had taken on many jobs while their husbands were away fighting, and afterwards they were able to keep many of those positions.


Wiki User

12y ago
