Britain began establishing colonies in North America in the early 17th century, founding Jamestown in 1607 and Plymouth in 1620. Throughout the 17th and 18th centuries, Britain expanded its territorial claims and influence, culminating in its dominance of eastern North America after the French and Indian War (1754-1763). That conflict solidified British control over much of the continent, but the colonies eventually sought independence, leading to the American Revolutionary War (1775-1783).
