Spain, England, and France were the European countries that established major colonies in what is now the United States.
Great Britain, Spain, the Netherlands
Portugal, France, Spain
Portugal, Spain, Holland (Netherlands), England, France.
England, Holland, Spain, France, Portugal
France, Britain, Denmark-Norway, and the Netherlands had colonies in North America along with Spain. Russia later established settlements as well, notably in Alaska, while other countries such as Portugal claimed territory in North America but never established lasting settlements.
The first countries to establish colonies in America were Spain, France, and Great Britain.
Spain established colonies for wealth and to convert people to Christianity.
Portugal and Spain were the two European countries that focused on exploration and colonization of the southern continent, which is now known as South America. Portugal established colonies in Brazil, while Spain had extensive colonies throughout South America.
If you are referring to the 13 colonies that later became the United States, it was England and, from 1707, Great Britain that established them. They weren't the first settlements, though: Spain and France had already established themselves on the continent too.
Countries are not ruled by other countries. The answer is none; nor does Spain have colonies today.