The US went to war with Germany because Germany declared war on the US on December 11, 1941. This was four days after Germany's ally Japan attacked the US fleet at Pearl Harbor, and three days after the US declared war on Japan. Germany was not obligated to declare war on the US under the terms of its treaty of alliance with Japan; it would have been obligated only if Japan were attacked, not if Japan was itself the aggressor. Nevertheless, Germany decided to open hostilities.
Germans were probably widely hated, but German Americans in the USA might not have been.
The Germans believed the ships were carrying war supplies to the Allies.
The Germans caused it because, if they got hold of Newfoundland, an island in the Atlantic, they could then bomb America and try to take it over.
Nazis.
Badly. There was a lot of anti-German sentiment, and some Germans were lynched. http://en.wikipedia.org/wiki/Anti-German_sentiment
No. America fought against the Germans.
Germany started World War I in 1914 and World War II in 1939. Additionally, Germany carried out the Holocaust during World War II, the systematic genocide of six million Jews.
The Germans
The Germans
The Germans
The Turks and the Germans
Horses and Americans