People in the US viewed Germany, in both WW1 and WW2, as part of Europe's problem. The US wanted no part of Europe's wars.
During WW2, when the US had no choice but to enter it after the attack at Pearl Harbor, US citizens wanted and demanded war against Japan. Japan ATTACKED the US; NOT Germany.
However, FDR (President Franklin D. Roosevelt) had been under political pressure from Britain's Prime Minister, Winston Churchill, since 1939 to ally the US with England and enter the war against Germany. The issue was that Britain needed the help of the US in order to survive! Consequently, FDR bent to the will of Churchill and agreed to save Europe FIRST, creating a "defeat Germany first" policy. This angered the people of the United States, who wanted very much to avenge the attack at Pearl Harbor.
After Germany declared war on the US (11 December 1941), the Nazis became extremely unpopular in the US.
Not so good.
American, British, & Russian governments
No, they were allies.
The Nazis did not exist during World War 1. The Nazi party was formed in 1920 and rose to power in Germany in the 1930s. They played a significant role in World War II, not World War I.
The Nazis took their land and money and killed them.
Adolf Hitler and the Nazis committed genocide against the Jews during WW2.
The Nazis persecuted the Jews during WWII.
No, the Nazis did not have schools in the US.
Yes, by the Nazis.
The Nazis were involved in Denmark during World War 2.
Adolf Hitler
All over Europe.
Great Britain did not.
The Nazis made these camps during World War 2.
The Nazis invaded Czechoslovakia.