Q: Was Florida gained by the US at the end of the Spanish-American War?

Best Answer

No. Florida was acquired from Spain under the Adams-Onís Treaty of 1819 and became a US state in 1845, decades before the Spanish-American War was fought in 1898.

Related questions

Was Florida gained by the US at the end of the Spanish-American War?

No. Florida was purchased from Spain in 1819, not taken as a trophy of war.


What country gained control at the end of the war?

France


What country gained control of Canada after the end of the war?

Great Britain.


What country gained control of Canada at the end of the war?

Great Britain.


When did the War of 1812 in Florida end?

The War of 1812 ended in 1815, with the Treaty of Ghent signed in December 1814 and ratified in February 1815.


What country gained control of Canada at the end of the French and Indian War?

Great Britain gained control over Canada and France lost it at the end of the French and Indian War.


At the end of the French and Indian War, which country gained possession of East and West Florida?

England


What hate group gained power after the end of the Civil War?

Ku Klux Klan


At the end of the Revolutionary War, the US was bounded on the south by what?

Florida


When did Jean Ribault land in Florida?

Jean Ribault landed in Florida in 1562.


What territory was not gained by the US at the end of the Spanish-American War?

American Samoa and the US Virgin Islands.


What happened to India and Pakistan at the end of World War 2?

They gained their independence from Great Britain in 1947.