American wealth enriched Spain because the two were trading partners. Businesses in America could produce useful goods for Spain thanks to American resources.
France and Spain.
No. Actually, the French sided with the Americans.
True. Thank you for your help.
The Philippines were part of the "spoils of war" taken from Spain in the Spanish-American War of 1898.
They sold conquered territories to the Americans, and they helped by sending money and supplies.
To convert the Native Americans and others to Catholicism.
To capture British land.
This question would be answered much more easily if the particular war in question were specified.
Spain did not do a thing for the American colonies. It was France and the Netherlands who helped the colonists. Spain was a Catholic nation and the colonists were Protestant, so Spain did not want to help Protestant colonists. Spain later sold Florida to the USA, but that was after the colonists won the Revolution against England (with the help of France and the Netherlands).
America fought only Great Britain, but had help from the Netherlands, Spain, and France.