Does Ben Franklin believe in the role of women in America?

Updated: 8/21/2019
Wiki User · 10y ago

Best Answer

Although Franklin believed women should be educated, and was quite attracted to women of intellect, he never advocated allowing them to be leaders in business or governmental affairs.

Related questions

The changing role of women in 1920s America was embodied by the image of what?

The changing role of women in 1920s America was embodied by the image of the "Flapper".


What role did Benjamin Franklin have in the new country of America?

His experiments with electricity are his best-known contribution today.


What is the role of men and women in America?

Both women and men can vote.


What did Annie Oakley do to change America?

She advanced the role of women in America.


What did many men and women believe was women's natural role?

One of subservience.


What did Martin Luther believe the role of women was?

Luther believed a woman's role was to stay in the home and be a wife and mother.


What role do women have in Congress?

In the United States of America, congresswomen play the same role as the men. However, there are fewer women in Congress than men.


How does the role of women in the U.S. differ from the role of women in Saudi Arabia?

The roles differ because in America women can work and largely do as they choose, whereas in Saudi Arabia women's activities are far more restricted.


What were the tensions Benjamin Franklin had?

Benjamin Franklin had many tensions, or struggles. From his role in America's founding and the drafting of the Constitution to his work as a politician, diplomat, and inventor, Benjamin Franklin faced many stressors.


What was the role of women in America in the 1930s?

The role of a Black woman in America in 1930 depended on the kind of life she lived and the education she had. Most worked as either live-in or live-out "house servants".


What role did women have in colonial America?

To take care of the household's needs.