France has never been a colony in the traditional sense. It was first organized into settled civilization under the Roman Empire, but as a constituent part of that empire, not a colony. After the fall of the Roman Empire, the territory of France was subject to various competing medieval kingdoms, most of which we would anachronistically call French, though some were anachronistically English or German. Towards the end of the Middle Ages, France was united as one nation and has remained so up to the present day.
Charlemagne was the King of the Franks. The Franks were a Germanic tribe that settled in Gaul, the region that became France.
France colonized Vietnam.
Most of West Africa was colonized by France.
Several European nations colonized America. Spain colonized Florida and parts of the Southwest. England colonized the East Coast and further inland. France colonized the Mississippi River area. The Netherlands colonized the area around New York but was driven out by England. Russia colonized Alaska.
France.
Italy, Britain, and France also.
The country which colonized Algeria is France.
Canada was colonized by France and England.
Yes, the French colonized Laos.
No, Cambodia is not a colony anymore. It used to be ruled by France.
France, in 1922.
They were colonized by France.
France
1893
Yes, they did.
France