
The Territory of Florida was an organized incorporated territory of the United States that existed from March 30, 1822, until March 3, 1845, when it was admitted to the Union as the State of Florida. The territory was originally the Spanish colony of La Florida, which was ceded to the United States as part of the 1819 Adams-Onís Treaty.


Related Questions

What year did Tallahassee become capital of Florida?

Tallahassee was founded as the capital of the Florida Territory in 1824.


What happened when Florida became a US territory?

When Spain ceded Florida to the United States under the 1819 Adams-Onís Treaty (ratified in 1821), the region was organized as the Territory of Florida in 1822, with Andrew Jackson serving briefly as its first military governor.


Was Florida a state or a territory?

Florida was both: it was an organized U.S. territory from 1822 until 1845, when it was admitted to the Union as a state.


Did Florida become English territory through the Treaty of Paris of 1793?

No. Spain ceded Florida to Britain in the Treaty of Paris of 1763, following the Seven Years' War. Britain then returned Florida to Spain in the Treaty of Paris of 1783, which ended the American Revolutionary War. No Treaty of Paris of 1793 dealt with Florida.


Where is the Florida territory located-?

The Florida territory is now simply the state of Florida. It is located in the southeastern part of the United States.


What power claimed the territory of Florida?

Spain claimed the territory of Florida.


What is the difference between Florida and the Florida territory?

The earlier Spanish province of West Florida included land that is now part of Alabama and Mississippi; the U.S. Territory of Florida (1822–1845) covered roughly the same area as the modern state.


How did the U.S acquire the territory of Florida?

The US acquired the territory of Florida when Spain ceded it under the 1819 Adams-Onís Treaty.


How did the United States gain the territory of Florida?

The US gained the Florida territory after arguing that Spain could no longer police or control the region; under that pressure, Spain ceded Florida to the United States in the 1819 Adams-Onís Treaty.


What territory did the US acquire from Spain in 1819?

Florida


How do you get child support if the other party is not working in Arizona?

Child support can still be pursued when the other party is unemployed or living in another state. Contact Arizona's child support enforcement agency; under the Uniform Interstate Family Support Act, orders can be enforced across state lines.


What country did the United States purchase from Florida?

The United States did not purchase any country or territory from Florida; Florida is itself part of the United States, acquired from Spain under the 1819 Adams-Onís Treaty.