Because during the Paris Peace Conference following WWI, Germany was left out of the discussions that shaped the new treaty. Germany was saddled with a large reparations bill and had territory taken away. They were never given a voice, and during the 1920s they suffered an economic collapse. When the Nazis rose to power in the 1930s and 40s, they placed the blame not only on Jews and other minorities, but also on the countries that had treated Germany so unfairly.
The war was inevitable for a multitude of reasons, but two stand out. Nazi Germany invaded Poland, which led Britain and France to declare war on Germany. Japan then attacked the US naval base at Pearl Harbor, Oahu, Hawaii, pulling the US into the war.
The war in Europe could have been avoided; Germany had no legitimate reason to invade Poland other than greed and a desire to spread its ideology.
The war in the Pacific is a little different. Japan was on a series of conquests of other nations that the USA did not approve of. The USA tried to curb this by imposing sanctions, the same tool used today: Iran over its nuclear program, and Iraq over Saddam Hussein's invasions and human rights violations, are two major examples.
Back to Japan: the USA seized all of Japan's assets in the USA and stopped or reduced oil and other exports. The oil embargo was the big one. Japan had two choices: pull out of the areas it had invaded, or secure oil elsewhere by further invasion. It chose to invade, figuring it could bomb the US fleet at Pearl Harbor, secure new assets, and then negotiate peace with a US too weak to continue the war. This was a huge mistake; the USA declared war and ultimately required Japan's unconditional surrender.
In the end, it could be said that the war in the Pacific would have occurred regardless of what Germany did. The US sanctions were so severe that Japan had to go to war or lose face and submit to US demands.
Germany declared war on the US, the US then declared war on Germany.
No, Germany declared war on the US.
Germany declared war on the US after Pearl Harbor.
December 11, 1941, only hours after Germany declared war on the US.
We did not want to go to war until the Japanese attacked Pearl Harbor. Shortly after we declared war on Japan, it was Germany who declared war on us. We had no say in the matter of war with Germany.
I don't agree that the war between Britain and Germany was inevitable, because they could always have held peace talks and exchanged proposals about what each country would do for the betterment of both.
Germany declared war on the US in the morning, and the US declared war on Germany in the afternoon on December 11, 1941.
The US did enter into an undeclared shooting war with Germany in the Atlantic in the fall of 1941, before Germany formally declared war on the US.
I don't think the US declared war on Germany first. Germany declared war on the US on December 11, 1941.
The US did not declare war on Germany. Germany declared war on the US shortly after the Japanese attack on Hawaii.
The US declared war on Germany hours after Germany declared war on the US, on December 11, 1941.
Germany fought the US in World War I.
The US declared war on Germany on December 11, 1941 (hours after Germany declared war on the US that same day). Everything from that date to the present is after.
At the start of the US involvement in WWII, the US was attacked by the Japanese. When the US declared that a state of war existed between the US and Japan, Germany declared war on the US. Your question should be: was Germany justified in declaring war on the US?