The US bought Florida, then a Spanish territory, from Spain.
No. Florida became a US state in 1845, and the Spanish-American War was in 1898.
Florida means "flowery" (full of flowers) in Spanish.
The Spanish moved to Florida :^)
No. Spain sold Florida to the US in 1819, many years earlier.
Yes, it has, but the US then traded it to Spain for Florida.
Yes
Spanish
The Southwest and Florida, along with California.
They wanted them to live.
Florida did not get involved, since at the time it was not one of the thirteen colonies that rebelled against the British. After the US War of Independence, the British ceded Florida to Spain. Spain sold Florida to the US in 1819 for US$5 million.
Colorado, Montana, Nevada, Florida, & California