The end of slavery in America arrived in stages, both during and after the Civil War. It began with Lincoln's Emancipation Proclamation in January 1863, continued with the Union's final victory over the Confederacy in April 1865, and was completed through national legislation, most notably the Thirteenth Amendment ratified in December 1865, and the practical reorganization of the post-war South in the months and years that followed. Through these stages, slavery finally came to an end as a living institution in American society.
