Foolishly, yes, they did, in December 1941.
It has never been made entirely clear why, but Hitler's Germany declared war on the USA on 11 December 1941, just days after Pearl Harbor was attacked by the Japanese. That attack, of course, led the USA to declare war on Japan.
Possible reasons Germany declared war on the USA:
1) They felt they were already at war. The USA was actively arming and financing Germany's enemies, and American warships had already attacked German naval vessels in an undeclared shooting war in the Atlantic, with German sailors killed in the process.
2) Germany hoped that by going against the USA it could get Japan to attack the Soviets in Siberia. No such luck, though: the Japanese army was outclassed on land against a major European army, and the Japanese knew it after their defeat at Khalkhin Gol in 1939.
3) Hitler may have thought he had already won the European war and might as well get into a fight with the USA too. It appeared at the time that Moscow would fall to German forces, which would have crippled Soviet chances for victory.
Germany would then only need to sit and defend its European empire from the west, a relatively easy task. In other words, Hitler might have thought that American involvement in Europe would be of little consequence and that a declaration of war was just a political statement.
In retrospect it would have been smarter for Germany to hold off on a declaration. Instead it should have worked to stabilize the eastern front with Russia, build up western defenses, and wait to see what happened between Japan and the USA. With an active war in the Pacific, the USA would probably have pulled back some support for Britain and Russia and put all its military effort into that fight.
It didn't. Germany invaded other countries, which led France and the UK to declare war on Germany.
Germany only ever officially declared war on the USA.
5 August 1914
No. It was vice versa.
Germany did not declare war on Britain. Britain declared war on Germany to protect Belgium.
Germany, after it invaded Poland, Austria, and France.
RMS Lusitania was sunk by a German U-boat.
The US did declare war on Germany during World War 1, partly because Germany had offered Texas, Arizona, and New Mexico to Mexico in the Zimmermann Telegram.
Yes they did. Twice.
Actually, I believe Germany technically declared war on the US during the Second World War (1941).
Germany didn't declare war on the US in World War 1. It was the US that declared war on Germany, on April 6, 1917, as a result of the unrestricted submarine warfare announced by Germany in January of that year. - I Warner
Quite the opposite: the US did not declare war on Germany first; Germany declared war on the US. It did this to honor its alliance with Japan after the Pearl Harbor attack.
Why did Germany declare war on France?
When Germany invaded Poland in 1939.
It never really had a chance to; it was occupied by Germany in the late 1930s.
Britain warned Germany that it would declare war if Germany attacked Poland. When Germany invaded Poland, the British had no choice but to declare war.
In reality, no president has ever declared war on Germany, since Congress is the entity with the power to declare war. The United States has fought against Germany twice, in both World Wars. Woodrow Wilson was president during World War I, and Franklin D. Roosevelt was president during World War II.
If this question is in reference to World War II, most countries declared war on Germany because Germany declared war on them.
The United States was bombed by the Japanese at Pearl Harbor, which caused a declaration of war on Japan. Germany had an alliance with Japan and declared war on the United States, which led the United States to declare war on Germany too. Italy, in turn, had an alliance with Germany and also declared war, so the United States found itself at war with yet another country. The whole world was then at war, and the United States was fully drawn into World War 2.
It was Germany that declared war on Russia, on June 22, 1941, when it invaded the Soviet Union.