The Germans and WWII basically gave America a god complex: Americans think the war made them great (since they believe they won it single-handedly). In reality, it is this very trait that makes the rest of the world resent America.
The Germans did not invade North America during WWII.
No. America fought against the Germans.
Because of Hitler.
The Germans caused it: the fear was that if Germany seized Newfoundland (then a British dominion in the North Atlantic, not yet a Canadian province), it could use it as a base to bomb and invade America.
Choices.
Probably by ship.
The Germans
Most Germans were probably hated, but German Americans might not have been!
100,000 Germans came to America.
They didn't invade North America.
Because they came by ship.
They were bored.
They came in the 1700s.