I'm assuming you mean what NON-native country. The Dutch were there. The Portuguese were there. The British were there, and stayed the longest. Their rule wasn't necessarily bad, but when they left they handled the partition badly, and millions of people died in the religious violence that followed. Of course, before them the Greeks were there, the Mongols, the Turks, etc etc etc.
European colonization of India began with the Portuguese, following Vasco da Gama's arrival in 1498.
Most of West Africa was colonized by France.
India and Burma
Thailand has never been colonized by any nation.
The British Empire colonized the nation we now call Guyana. It was called British Guiana until the nation gained its independence in 1966.
the British Empire
India was known as the jewel in the crown, and was colonised by Great Britain.
The United Kingdom colonized the African nation of Lesotho.
By the British.
Several countries colonized India... if you search "Imperialism in India" and click the first link, it will give you all of the Western countries that colonized India.
In India of course.
Several countries colonized India; Portugal and England were the main ones.
The European nation that colonized India and Australia, while also having spheres of influence in China, is the United Kingdom. The British established control over India through the British East India Company and later direct rule, while Australia was colonized by British settlers in the late 18th century. In China, the UK exerted influence through treaties and concessions following the Opium Wars. This colonial expansion significantly shaped the history and development of these regions.
yes
Both countries were colonized by European imperial powers. India was colonized by Great Britain, and Haiti was colonized by France.