Q: How did women's rights change America?

Updated: 8/16/2019
Wiki User

16y ago

Best Answer

It gave women the same legal rights as men, most notably the right to vote, which was guaranteed nationwide by the Nineteenth Amendment in 1920.
