Q: What happened to the Nazis after World War 2 ended?

Best Answer

A few were put on trial; some of the wanted men managed to escape to South America (especially Argentina and Paraguay). Most, however, returned to Germany, where in most cases they were 'denazified' by special tribunals, that is, 'cleared'.

Wiki User · 16y ago
More answers
Wiki User · 11y ago

Germany was defeated by the Allied forces and surrendered on May 8th, 1945 (V-E Day).

Related questions

What happened after World War 2 ended?

The Korean War broke out in 1950.


What happened at the end of World War 1?

World War 1 ended.


Who said World War 2 was over, and when?

The war in Europe ended when Nazi Germany surrendered (gave up) in May 1945, after Allied troops had liberated the concentration and labor camps.


What big occasions happened in 1911?

World War 1 did not end in 1911; it began in 1914 and ended in 1918.


What happened on the first Armistice Day?

The fighting in World War 1 ended; the armistice took effect on November 11, 1918.


What happened to the fighting in France when America joined World War 1?

It eventually ended: American troops helped turn the tide, and the fighting stopped with the armistice of November 1918.


When did the 1st World War happen?

WW1 started in 1914 and ended in 1918.


What happened after World War I to start World War 2?

The Treaty of Versailles, which ended World War 1, was very harsh on the defeated Germans, disgracing them and forcing them to give up land. This harsh political climate allowed a radical political party, the Nazis, to gain power by promising to reinvigorate German national pride. The Nazis began rapid rearmament to make Germany a world power again, and their expansionist aims eventually started World War 2.


What happened from 1941 to 1945?

The US entered World War 2 in 1941 and the war ended in 1945.


What happened after the bombing of Japan during World War 2?

Japan surrendered; the war formally ended on September 2, 1945.


What were some major issues with the peace plan after World War 1?

Not sure; I'm taking world history, but we haven't gotten far into WW1 yet. We're currently talking about the Nazis and the war that happened with them.


What happened before the Cold War?

The Cold War began after the Second World War ended with the defeat of Germany and Japan.