Brazil - they speak Portuguese.
No. Florida is a state in the United States.
The English claimed Florida and controlled the entire east coast, while Spain got modern-day New Orleans and Louisiana.
Florida is not a country; it is one of the states of the US.
No, Florida is a state.
Florida is a state.
The United States did not purchase any country or territory from Florida (Florida is part of the United States).
No, Florida is not a country; it is a state.
The country that originally colonized the state of Florida was Spain.
The country that is due east of Florida is the Bahamas.
Spain was the first country to explore the coast of Florida.