Q: Did Spain once own Florida?
Best Answer

Most of what is now the US belonged at one time to Spain. The area covered by the Louisiana Purchase (the Mississippi valley and Midwest) had only been transferred back to French control a few years before the purchase.

Florida was acquired directly from Spain in 1819; Texas, Arizona, and California had also been Spanish territory, but they passed to the United States by way of Mexico.

Wiki User · 15y ago

More answers

Wiki User · 6y ago

Yes. During the Age of Exploration, the conquistador Hernán Cortés conquered present-day Mexico for the Spanish Crown. From 1521 until 1821, Mexico remained an overseas territory of the Spanish Empire.

Wiki User · 13y ago

Spain does not have any land in the Americas.

Wiki User · 14y ago

Yes, but Spain gave Florida up to Britain in 1763 after defeat in the Seven Years' War; it regained the territory in 1783 and finally ceded it to the United States in 1819.

Wiki User · 13y ago

They were the first to arrive there, but they did not claim it; France did.

Wiki User · 11y ago

At the time of the Battle of the Alamo, Texas was held by Mexico, not Spain. Before that, though, Texas had been Spanish territory; it passed to Mexico when Mexico gained independence in 1821.
