During the Civil War, many women took on traditionally male jobs, such as farming, because most men were away fighting. They were still not allowed to vote.
Some women disguised themselves as men in order to fight in the Civil War, because women were not permitted to enlist at that time.
It was partly an effort to change the role of women; it definitely helped their cause, though, once they told the men they were fed up with how they were treated.
The question as asked appears nonsensical. The Civil War was fought to save the Union. The South seceded over states' rights and property, which were euphemisms for slavery. Women's rights played no part in the Civil War; women's rights came into play only after the war ended.
The Union was the North and the Confederate States were the South.
* The Civil War
* The War Between the States
Women continued their pursuit of equal rights following the Civil War. During the war, women gained respect by proving they could take on the responsibilities men left behind when they went off to fight, while still maintaining their homes and families. This helped the women's rights movement immensely.
Women were typically homemakers before and after the American Civil War.
The Civil War in the United States began in 1861.
Yes, there were women doctors in the Civil War; Mary Edwards Walker, for example, served as a surgeon for the Union Army.