European imperialism was still evident after World War I because increased tensions between European countries led to militarism and the formation of alliances.

For example, France and Great Britain took over former German colonies.
