The American Colonies never joined the British Empire; they were founded as British colonies and so were part of the Empire from the start. In the spirit of civil liberty, the Americans declared their independence in 1776 and fought the Revolutionary War (1775–1783) to secure it. When the war ended, Britain decided that those unruly Americans were best left to their own devices and recognized the new nation, and the United States was born. The United States is now an independent country spanning most of a continent plus several islands.


Wiki User

18y ago
