The English did not declare war on Germany; Great Britain declared war on Germany because Germany had invaded a European country with which Great Britain had a mutual defense agreement.
Once Germany invaded Poland, England and France declared war on Germany. That was the official beginning of World War II.
Did the United States declare war on Germany first, yes or no?
Germany started a series of invasions of neighboring countries. France and England had a treaty with Poland to mutually defend each other. When Germany invaded Poland, it was on!
Because Neville Chamberlain sent Hitler an ultimatum telling him that if Germany did not get out of Poland by 11 a.m. on September 3rd, Britain would declare war. Germany did not get out of Poland, so Britain declared war.
What is always mentioned is that Hitler ordered the German Army to invade Poland and that England had a defense treaty with Poland. What is rarely mentioned is that the Communist USSR (Russia) also invaded Poland from the east shortly afterward, yet France and England did not declare war on the USSR.
It did not. Germany declared war on the United States first, on December 11, 1941.
England did not declare war on Germany; Great Britain did. England is not an independent nation. Nonetheless, the answer is still Poland.
England declared war on Germany in September 1939.
September
Poland
The Germans attacked Poland, which caused England and France to declare war on Germany; a state of war then existed between them.
The invasion of Poland by Germany in 1939.
11:15 a.m. on September 3rd
England and France declared war on Germany after Germany invaded Poland (Russia invaded from the east shortly afterward).
When Germany invaded Poland, France and England declared war on Germany; this started WWII.
The USA declared war on Germany (in World War I) after it was revealed that Germany wanted Mexico to declare war on the USA, and after Germany announced it would resume submarine sinkings of civilian, neutral ships carrying supplies to England and France.