No, Germany did not win World War 2.
Germany, like Japan, ran out of men and equipment.
They never won the war.
The war between Germany and Japan ended in the year 1945.
World War 1 ended after Germany, having lost a lot of ground and realizing it could not win, surrendered; the armistice took effect on November 11, 1918. World War 2 in Europe ended when Soviet forces captured Berlin and Germany surrendered in May 1945; Japan surrendered in September 1945.
No. In fact, Germany did not even survive the war as a single country; it was split into East and West Germany until reunification in 1990.
Germany sought to seize as much territory as possible in its attempt to win World War 2.
No. Germany spent enormous sums trying to win, but still lost.
Hitler died in April 1945, before WW2 was over.
To help win the victory over the Nazi German forces.
No, otherwise Germany would probably still be under Fascist rule.