Yes, slavery in America really did end after the Civil War. It was formally abolished by the Thirteenth Amendment, ratified in December 1865, which made slavery illegal throughout the United States.


Wiki User

10y ago