December 11, 1941, immediately after Germany declared war on the US.
Late December 1941.
The US did not declare war on Germany first. Germany declared war on the US shortly after the Japanese attack on Pearl Harbor, and the US then declared war in response.
Did the United States declare war on Germany first, yes or no?
Germany didn't declare war on the US in World War 1. It was the US that declared war on Germany on April 6th, 1917, as a result of the unrestricted submarine warfare introduced by Germany in January of that year. - I Warner
Germany
In December 1941.
Besides the obvious, because Germany declared war on the US hours before.
Germany
Germany and Italy declared war on the United States on December 11, 1941, in compliance with their agreement with Japan to do so, and the US responded by declaring war on both Germany and Italy. In fact, the US and Germany had been in a virtual state of war for almost a year already, considering America's active support for Britain and the Soviet Union through Lend-Lease and the Atlantic convoy system.
Germany declared war on the US after Japan carried out its part with the attack on Pearl Harbor.
December 11, 1941, only hours after Germany declared war on the US.