
It is accepted by Germans that Germany started WWII in Europe. However, this came about as a result of serious humiliation and impossible economic hardship brought about by the Allies and their demands for German land and reparations following the defeat in 1918. This left a prostrate Germany with the only option it was still good at: militarism. If the victors of WWI had been as enlightened as those of WWII, Germany would readily have been integrated into the democratic world community, just as it was after WWII, and WWII could have been avoided.

Today there is no more democratic country than Germany, at least not in Europe. Its constitution, largely shaped by the Allied military occupation authorities after 1945, forbids Germany's use of military force. Germans make good allies, but not fighting allies; the fight was taken out of them long ago.

There is one issue, hardly ever mentioned in the U.S., that appears to relegate Germany to an 'invisible' role in international affairs. In the U.S., Germany is hardly ever mentioned in any positive way. There is some sensitivity among Germans about this, and some believe that some Americans avoid even an overflight across German territory as a gesture of disdain or fear. How can modern, democratic Germany ever regain the world's full respect when there are those who, at every opportunity and even in the 21st century, insist that what they have coined "The Holocaust" is the most important event associated with Germany?

Hal Mayer


Wiki User · 16y ago
