It's accepted by Germans that Germany started WW2 in Europe. However, it was a result of serious humiliation and impossible economic hardship brought about by the Allies and their demands for German land and reparations following the defeat in 1918. This left Germany prostrate, with the only option it was good at: militarism. If the victors of WWI had been as enlightened as those of WWII, Germany would readily have been integrated into the democratic world community, just as it was after WWII, and WWII could have been avoided.

Today there is not a more democratic country than Germany, at least not in Europe. Its constitution, largely designed under the Allied military occupation after 1945, forbids Germany's use of military force. Germans make good allies, but not fighting allies; the fight was taken out of them long ago.

There is one issue, hardly ever mentioned in the U.S., that appears to relegate Germany to an 'invisible' role in international matters. In the U.S., Germany is hardly ever mentioned in any positive way. There is some sensitivity among Germans about this, and some believe that some Americans avoid even an overflight across German territory as a symbol of disdain or fear. How can modern, democratic Germany ever regain the world's full respect when there are those, openly supported in their intention, who insist at every opportunity, even in the 21st century, that what they have coined "the Holocaust" is the most important event associated with Germany?

Hal Mayer
Boundless expansion, and the eradication of Communism and its supposed 'biological root and carrier', the Jews: these were the main official aims at the time. Since the end of World War 2, however, most Germans have regarded the war as a disaster.
I don't know how the Germans in Germany feel about World War 2. I have heard they are ashamed of what the Nazis did in that war. I can tell you about a German man who was put into a concentration camp simply because he had been born in Russia but raised in Germany and was naturalized as a German. I worked for him. He hated that war, the Nazis, and all the atrocities he suffered as a prisoner, and he was a German. Many of the other Germans I know here in the United States cannot tolerate talking about that war and what happened to them before they escaped Nazism. They too are ashamed, and they want the entire subject to go away. EXCEPT: they want everyone to realize the Holocaust really did happen. I am glad no one ever told my boss the Holocaust did not happen, considering he had the tattoo on his arm.
To take control of Europe, then eventually the United States, and then the world.
For the most part, they were quite enthusiastic about it in the beginning.
That it ended badly.
Europe changed its whole perspective about the other countries as a result of the war.
Germany was very much an autocracy during World War 2.
Germany did not attack Japan in World War 2; the two countries were allies. It was the United States that dropped atomic bombs on the Japanese cities of Hiroshima and Nagasaki in August 1945.
Germany started World War 2 in Europe when it invaded Poland on 1 September 1939.
France, Britain and the United States occupied Germany in the west after World War 2.
No, the last war fought in Germany was World War 2.
Japan and Italy fought against Germany in World War 1 but were on Germany's side in World War 2.
World War 2
Who won World War 2 between Trinidad and Germany? Germany lost the war; Trinidad, then a British colony, was on the winning Allied side.
Germany and Japan were allies during World War 2. They were not at war with each other.
Germany was in WW I and WW II.
Germany's currency during World War 2 was the Reichsmark.
The problem in World War 2 was that Germany, Japan, and Italy (mostly Germany) wanted to conquer the world.
No. Germany started World War 2 in Europe and remained at war until the war ended. Germany attacked the Soviet Union in June 1941, and the Soviet Union went on to play a key role in defeating Germany.
no