Spain did not do a thing for the American colonies; it was France and the Netherlands who helped the colonists. Spain was a Catholic nation and the colonists were Protestant, so Spain did not want to help Protestant colonists. Spain later ceded Florida to the USA, but that was after the colonists won the Revolution against England (with the help of France and the Netherlands).


Wiki User

14y ago
