What do the terms Union and Confederacy mean?

Anonymous

12y ago

The Confederacy was the group of Southern slave states that broke away from the United States and declared themselves a separate nation (one that was never officially recognised by any foreign country).

The Union was what was left of the USA: the Northern states that remained loyal to the federal government.

After the war, the states were reunited.

Wiki User

12y ago
