No, the colonies were not considered a continent. They were separate regions established by various European powers in North America before they eventually became the United States; the continent they were part of is North America.
