The Treaty of Versailles ended World War I.
It was the surrender of Germany in World War II that ended it.
World War 1 ended when Germany, having lost a lot of land and realizing it could not win, surrendered on November 11. World War 2 ended when Russia captured Germany, and Japan surrendered later in 1945.
No, the monarchy in Germany ended with World War 1 in 1918.
World War 2
The Holocaust ended when Germany lost World War II. Hitler had already committed suicide when the war officially ended.
There is no war in Germany that is still being fought. (If you are asking about World War 2, that ended in 1945).
They both ended with Germany defeated.
Germany
The war between Germany and Japan ended in the year 1945.
In World War 1, Germany started the war, but in World War 2, Japan started it and the USA ended the war.
Treaty of Versailles