Women's colleges were founded in the 19th and early 20th centuries to give women access to higher education, which was historically limited to men. These institutions opened academic and professional paths that had previously been closed to women and promoted gender equality and empowerment through education.
