Q: What was the perspective from Germany of World War 2?

Best Answer

It's accepted by Germans that Germany started WW2 in Europe. However, it came as a result of serious humiliation and impossible economic hardship brought about by the Allies and their demands for German land and reparations following the defeat in 1918. This left Germany prostrate, with only the one option it was good at: militarism. If the victors of WWI had been as enlightened as those of WWII, Germany would readily have been integrated into the democratic world community, just as it was after WWII, and WWII could have been avoided.

Today there is not a more democratic country than Germany, at least not in Europe. Its constitution, largely designed by the Allied military occupation forces after 1945, forbids the use of German military force. Germans make good allies, but not fighting allies; the fight was taken out of them long ago.

There is one issue, hardly ever mentioned in the U.S., that appears to relegate Germany to an 'invisible' role in international affairs. In the U.S., Germany is hardly ever mentioned in any positive way. There is some sensitivity among Germans about this, and some believe that some Americans avoid even an overflight across German territory as a symbol of disdain or fear. How can modern, democratic Germany ever regain the world's full respect when there are those, openly supported in their intention, who at every opportunity, even in the 21st century, insist that what they have coined "The Holocaust" is the most important event associated with Germany?

Hal Mayer

Wiki User, 15y ago
More answers
Wiki User, 15y ago

The main official aims at the time were:
1. Boundless expansion.
2. The eradication of Communism and its supposed 'biological root and carrier' - the Jews.
Since the end of World War 2, however, most Germans have regarded the war as a disaster.

Wiki User, 13y ago

I don't know how the Germans in Germany feel about World War 2. I have heard they are ashamed of what the Nazis did in that war. I can tell you about a German man who was put into a concentration camp simply because he had been born in Russia but raised in Germany and naturalized as a German. I worked for him. He hated that war, the Nazis, and all the atrocities he suffered as a prisoner - and he was a German. Many of the other Germans I know here in the United States cannot tolerate talking about that war and what happened to them before they escaped Nazism. They too are ashamed, and they want the entire subject to go away - EXCEPT that they want everyone to realize the Holocaust really did happen. I am glad no one ever told my boss and friend that the Holocaust did not exist, considering he had the tattoo on his arm.

Wiki User, 12y ago

To take control of Europe, then eventually the United States, and eventually the world.

Wiki User, 14y ago

For the most part, they were quite enthusiastic about it in the beginning.

Wiki User, 15y ago

That it ended badly.

