
Colonists, such as the Europeans who colonised the American continent, believed they had the right to settle in those lands because the lands were "unclaimed" by anyone else (that is, not part of another European state). Generally speaking, they did not recognise any Native American nations as sovereign, and they also considered it a divine obligation to impose the Christian religion on the "uncivilised" and "pagan" native peoples of America, who would otherwise live damned lives of ignorance and barbarity.

Of course, viewed through the lens of modern morality, the colonists had no "right" to take other peoples' lands.


Wiki User

14y ago
