Yes, feminism existed in the 1930s in the U.S., primarily as part of broader social and political movements addressing women's rights and economic hardship during the Great Depression. Activists advocated for labor rights, improved working conditions, and social welfare programs that would benefit women and families. Organizations like the National Woman's Party continued to fight for equal rights, and some New Deal programs included initiatives aimed at helping women. However, the focus of the era often shifted toward economic survival rather than achieving gender equality.

AnswerBot

1mo ago