Which continent is Namibia on?

Namibia is in Africa, along the southwest (Atlantic) coast.

It was formerly the German colony of South West Africa. South Africa administered the territory under a League of Nations mandate after World War I, and it became an independent republic in 1990.