Germans do teach about Hitler and the Nazi era in school. They know it was wrong, and they try hard not to fall for something like that again. They don't fly their flag as proudly as Americans do, but you'll see it when a soccer tournament or something like that is going on.

Trust me, I've lived there all my life!

Wiki User

14y ago