
How did the US join the empire?


Top Answer
Wiki User
2008-03-15 23:14:34

The United States never joined the British Empire; the American Colonies were part of it from their founding, and the United States left it rather than joined it. In the spirit of civil liberty, the colonists declared independence in 1776 and fought the American Revolutionary War (1775-1783). After losing that war, Britain decided the Americans were best left to their own devices and recognized their independence, and the United States was born. Today the United States is an independent country spanning most of a continent and several islands.

