Did the Spanish colonize the Americas?

Best Answer

Yes. The Spanish colonized large parts of the Americas, including Florida and the present-day American Southwest.

Related questions

What groups of Native Americans presented a problem for Spanish efforts to colonize Texas?

The Comanche were the main problem for Spanish efforts to colonize Texas.


Why did the Americans colonize the Philippines?

The Americans colonized the Philippines after acquiring the islands through the Treaty of Paris (1898), in which Spain ceded them to the United States at the end of the Spanish-American War.


Which European country was the first to establish colonies in the Americas?

This is open to debate, but setting aside the earlier Norse voyages, Spain was both the first European country to reach the Americas and the first to establish lasting colonies there; Britain colonized later.


Did the Spanish colonize Colombia?

Yes.


When did the Spanish enter Canada?

If the question is "When did the Spanish colonize Canada?", the answer is never.


How did Britain colonize Canada?

The Native Americans helped them.


Why did the Spanish colonize Mexico?

God, gold, and glory.


Why did the Anglo-Americans colonize Texas?

Because they wanted land.


How did Great Britain colonize Canada?

The Native Americans helped them.


Why are the Native Americans called Native Americans?

Because they were the first people to colonize (most of) the continent.


Who was the first and largest group of Europeans to colonize Brazil?

The first and largest European group to colonize Brazil was the Portuguese.


Which Spanish explorer received permission to colonize Florida?

Juan Ponce de León got permission to colonize Florida.