We need to know which war you mean to answer the question.
Women gained jobs, but African Americans lost them.
The end of the First World War gave women more of a voice and say in the workplace: they had taken many jobs while their husbands were off fighting, and they were able to keep some of those jobs afterward.
If it is World War 1: women were positively affected by the war, as they got jobs in ammunition factories while men went off to fight. Most wars have given women new freedoms, even though they have caused destruction to humanity as a whole.
During World War II many women entered the workforce. The end of World War II affected women in the workplace because many of them returned home rather than staying in their jobs.
Men had to leave, so women had to take over their jobs.
It ended economic opportunities for women.
Women did not have much direct effect on the fighting, but the ones who did worked in factories making ammunition. Without them there would have been nothing for the soldiers to fire from their guns.
Women gave up their jobs at the end of World War II in the United States because the men had come home from the War. They had taken these jobs because the workforce needed them while the men were away.
Many women took the places of men in the factories and other jobs in the U.S. In Germany, women did at least a year of farm work to help with the war.