Britain and France both explored the area and established colonies there. After the Seven Years' War, France ceded most of its North American colonies to Britain in 1763. In the years since, a constitution was adopted and greater separation from Britain has been achieved, though the country remains a member of the British Commonwealth.

Wiki User

13y ago