
What is the role of women in American society?


Anonymous

11y ago
Updated: 10/27/2022

In American society, women have the same legal rights as men, and many women work in prestigious and highly skilled professions.
