Florida was ruled by several countries throughout its history, including Spain and Britain. Spain first claimed Florida in the 16th century, and it remained under Spanish control until 1763, when it was ceded to Britain. Spain regained control in 1783 after the American Revolutionary War, and Florida was eventually ceded to the United States under the Adams–Onís Treaty of 1819, officially becoming a U.S. territory in 1821.
