For almost all of its history, African communities were not controlled by any outside force or nation. Until the late 19th century, European countries were content to limit their presence to a modest number of trading stations along the coast. Africa was colonized only from around 1890, and then for no more than about 70 years in all. The initial reason, strange as it may sound, was largely the European effort to wipe out slavery within Africa itself, which local rulers in the regions south of the Sahara refused to abandon.
The colonizing countries were mostly Britain, Belgium, and France, with Portugal, Spain, Italy, and Germany playing secondary roles. Contrary to the assumption in your question, no other country in the world controlled any territory in Africa.
The United States has never held any territory in Africa.
He did buy it: the United States purchased the Louisiana Territory from France in 1803 in the Louisiana Purchase; it was not won from Mexico. The Mexican-American War decided control of a different region, the Mexican Cession in the Southwest, which the US won from Mexico.
"africa has a allot of natreral recourses that their counrties didnt like dimonds" *Africa *a lot *natural *resources *countries *didn't *diamonds
* No, they didn't.
It didn't.
They didn't.
No, it didn't.
The war didn't really end in Africa; the fighting there ended, but that did not end the war itself. Operation Torch was conducted by both American and British forces. Hitler wanted to control Africa in order to control the Mediterranean, while the US aimed to take Casablanca and remove any threat from the area. The Allied troops pushed hard, and the Vichy French commander, General Juin, surrendered on the day of the landings, so the American troops ended that fight in Africa almost before it could begin.
There are in fact mountains in Africa, such as Mount Kilimanjaro, the Atlas Mountains, and the Ethiopian Highlands.
They didn't; they're dead.
They didn't become a colony.