No, not at all; they were not involved in the First World War.
They died.
World War I became a world war when the Allies and the Central Powers started to invade colonies around the world, especially in Africa.
The world may never know.
World War II prevented the Nazis' total conquest of the world.
African territories were ordered to give up their land to the governing powers; such an arrangement is called a mandate.
World War I
America did fight in World War I; it entered the war in 1917.
For territories in Asia and Africa.
Germany
Yes!
How did it not?
World War I was fought in Germany, Africa, and other places around the world (I think).