
It completely destroyed it, in my opinion. England had to give away everything it had and go into massive debt to "win" WW2. When you compare German society to British society today, it's glaringly obvious that Germany is the overall winner. People are less aggressive, there is far less violence in the streets, and the young people are quite polite and still have respect for their elders. Don't believe me? Have a look at German news websites: how many stabbings, beatings, and murders do you see each day? Nothing compared to the count in England. I went back to England last year after nearly 30 years away and was completely shocked. England no longer exists for me.


Wiki User

14y ago
