Q: How do Americans feel about Germany?

Best Answer

During World War II, Americans strongly disliked Germany, and the feeling was mutual among Germans, since both sides were fed large amounts of propaganda about the other. Today, Americans and Germans get along well, and there appears to be little, if any, residual animosity.

Wiki User

9y ago