What occurred after WWI was that the German people felt the loss of the war was the result of politicians and high-level military personnel, along with the Jews, stabbing the people in the back by attempting to reach a surrender agreement before the military had actually lost the war. In addition, the Treaty of Versailles imposed harsh restrictions and reparations on Germany and its military. Government became non-functioning because of the incredible fragmentation of its political parties. Inflation reached a point where money was essentially worthless, and housewives would literally use it to start fires. No assistance of any kind was given to help rebuild or repair Germany's economic, political, and social infrastructure, leaving the country very unstable and volatile. After WWII, it was critical to show the people that their military had been completely defeated and that no victory was possible. Secondly, harsh penalties were avoided to keep the German people from turning to extremist political groups for direction. Germany was thus split into four occupation zones controlled by the US, UK, France, and the USSR. The goal of the US, in short, was to create a new political system and gradually allow Germany a military, as well as to assist in rebuilding its economic infrastructure and help repair the country after six years of a very destructive war, so that one day Germany would be a self-governing and stable state.
Nobody does anything for one reason; there are always several reasons, sometimes inconsistent, for most actions.
An impoverished, war-wracked society is a prime breeding ground for tyrants; look at Syria, Iraq, or Libya these days. Historically, German culture had long been warlike; Germany started three European wars between 1870 and 1940. If the Allies had destroyed Germany and then gone away, the Germans would have raised up another tyrant to replace the one we'd just deposed.
If we had abandoned Germany, the Soviet Union would have taken over ALL of Germany, not just the eastern part.
Poor Germans from a devastated Germany wouldn't have been able to afford anything that Americans wanted to sell them. Rebuilding Germany was in America's best interests, just in terms of creating markets for our products.
Germany would have won the war.
In World War 1, and in the 20 or so years before 1914, some Germans wanted Germany to acquire the status of a 'world power'. This meant different things to different people. What led Germany to 'want to dominate the world' was the most extreme variety of German nationalism, Nazism. That ambition wasn't, in itself, a result of World War 1.
Germany fought the US in World War 1.
Hitler wanted to control the whole world, and Russia was part of that. Russia was also a threat, and one of the countries that had defeated Germany in World War I.
World War 2 started when the Allies declared war on Germany after Germany invaded Poland. Germany invaded Poland because Hitler wanted to reclaim the land that Germany had lost after World War 1.
World War 2
In 1939 Germany invaded Poland, starting the war.
Great Britain declared war on Germany.
Poland
It led to war because other countries didn't like the fact that Germany wanted to take over, so they declared war, and that is how World War 1 happened.
Germany did want a general war. Germany wanted to become a dominant world power by replacing Britain, but it needed more colonies to do so.
Countries had power and wanted Germany to stop fighting.
Because Hitler wanted to.
Adolf Hitler served in World War 1 and was angry that Germany lost. Other countries made Germany pay its war debts, so he became leader of Germany and started World War 2. He was also trying to take over the world. Still, Germany lost, and Hitler failed to restore the nation to glory.
On September 1st, 1939, Germany invaded Poland and started WWII.
The people who were left wanted to punish Germany.
All over Europe, such as in France, Germany, Belgium, etc.