
If you mean what happened to slaves once they reached the Americas, many things happened. First they were taken to the auction house, where they were sold to white buyers, known as "masters". If they were not sold, they could be left out on the streets to die of hunger or in their own filth. The people who most often suffered this brutal treatment were women with infants to look after. In those days, women were considered weak, and with an infant to feed, a master's food bill would rise, so he might not be able to afford to keep "his" slave.

If you mean what happened to the slaves after the American Civil War: most were freed, but they were treated as inferior to white people and were severely discriminated against for the rest of their lives.
