Asked in US Civil War

What did the Union gain after the Civil War?

Answer

Wiki User
06/27/2014

The Union preserved the United States as a single nation and secured the end of slavery. The Emancipation Proclamation, issued by President Lincoln in 1863, declared slaves free in the Confederate-held areas, and the Thirteenth Amendment, ratified in 1865, abolished slavery throughout the United States. The Union also restored federal authority over the Southern states that had seceded.