Actually, the only country Germany declared war on was the United States, and no other. Why, you ask? Germany didn't need to declare war on any other country, because the other countries it fought had already declared war on Germany first, so there was no need.
It is often said that Hitler declared war on many countries during World War II, but Great Britain declared war on Germany, not the other way around. The only country Germany formally declared war on was the United States.
Germany only officially declared war on the USA.
1 - The USA. He never declared war on any of the countries he invaded. A few days after Japan bombed Pearl Harbor, on December 11, 1941, Hitler declared war on the USA (one of his many mistakes).
Believe it or not, the US was the only country Hitler officially declared war on. Hitler's motives were unclear, but then, many of them were.
answered by: **El Tay**
In both.
What countries were defeated in World War 1?
It didn't. Germany started invading other countries, which led France and the UK to declare war on Germany.
France and Britain declared war on Nazi Germany on September 3rd, 1939, two days after Germany invaded Poland. This is generally considered the official start of World War 2.
If this question is in reference to World War II, most countries declared war on Germany because Germany declared war on them.
answered by: **El Tay**
England and France declared war on Germany after Germany (and Russia) invaded Poland.
answered by: **El Tay**
Austria-Hungary was the first country to declare war on Serbia in World War 1. The other countries were pulled into the war because they had alliances with either side.
Widespread hostility toward Germany began after its invasion of Poland. During WWII, the first two countries to declare war on Germany were Great Britain and France.
answered by: **El Tay**
no
Germany
The English did not declare war on Germany; Great Britain declared war on Germany because Germany had invaded other European countries with which Great Britain had a mutual defense agreement.
yes