Because the color of their skin made them different from the whites.
No, mainly in the southern colonies. The northern colonies were against slavery, and there were few slaves in the north.
Slavery was finally abolished in French colonies in 1848
Slavery provided labor for the developing textile industries in the southern colonies.
If you mean the American colonies (there were others), the answer is no.
Slavery has been around for thousands of years and only becomes unacceptable to a society when that society realizes what it is doing is wrong. In the US it was not an accepted part of many colonies because they saw that owning other people wasn't right. There are still countries today where slavery is alive and thriving.
The colonies of Georgia and North Carolina initially opposed slavery. However, once plantations began being built, there was a need for cheap labor, and slavery was accepted.
Puritans and Quakers accepted slaves in their new colonies.
Slavery arrived in the colonies in 1619, so the colonies started with slavery.
slavery
The Southern Colonies. :)
The "plantation colonies" allowed slavery. Those colonies were Maryland, Virginia, North Carolina, South Carolina, and Georgia.
In Colonial America, all colonies had slavery when the Revolutionary War began.
Slavery was introduced to the British colonies to support the labor-intensive cultivation of crops.