France and Great Britain took over former German colonies.
Yes; after WWII, imperialism was no longer accepted.
True. The end of European imperialism came about after the war.
No. The Atlantic Charter rejected imperialism.
WW1: imperialism. WW2: fascism.
It did exist.
No, that was WW1. It gave rise to new ideologies (communism and fascism), which much of Europe embraced.
No.
Yes. There are people who believe that it still exists to this day.
The bombing of Pearl Harbor.
The unquestioned discipline of Japanese soldiers and civilians!
They were both wars.
Japanese imperialism propelled the country into World War 2 in 1941; after the war, the nation adopted a constitution and outlook that put this behind it, becoming a peaceful democracy.