Yes. But primarily due to the formation of the United Nations, and advent of the global military power of the US and Soviet Union. Colonialism had essentially run its course, with few areas of the globe left to conquer or claim.
Hitler kept taking over more and more countries. Britain appeased him and allowed it, but when Hitler went too far, Britain declared war on Germany.
Yes. It was the main cause.
True. The end of European imperialism came about after the war.
Yes. After WWII, imperialism was no longer tolerated.
No. The Atlantic Charter rejected imperialism.
WW1: imperialism. WW2: fascism.
No, that was WW1. It gave rise to new ideologies (communism and fascism), which much of Europe embraced.
Yes. There are people who believe that it still exists to this day.
The unquestioned discipline of the Japanese soldiers and civilians!
The bombing of Pearl Harbor.
Japanese imperialism projected the nation into World War 2 in 1941; after the war, Japan adopted a constitution and an approach that put this behind it, becoming a peaceful democracy.
Imperialism (the desire to build a large empire)
Mainly because of imperialism and militarism. The Nazi German "Blitzkrieg" invasion of Poland in 1939 started the war.
The Cold War was.