No.
No, Germany didn't win World War 2 at all.
They never won the war.
The war involving both Germany and Japan ended in 1945.
because of you...
No. In fact, Germany didn't even survive the war intact as a country; it was split into East and West Germany until reunification in 1990.
World War 1 ended when Germany, having lost a lot of ground and realizing it could not win, agreed to an armistice on November 11, 1918. World War 2 ended in 1945, when Soviet forces captured Berlin and Germany surrendered, followed by Japan's surrender later that year.
Germany wanted to claim as much land as it could so that it could win World War 2.
No, the last war fought in Germany was World War 2.
No, otherwise Germany would probably still be under Fascist rule.
To help win victory over the Nazi German forces.
Hitler died before WW2 was over.