The United States knew after Nazi Germany took over France that it would have to be involved in the war, since Great Britain would otherwise be left alone with no help. Great Britain resisted the threatened invasion and succeeded. After Pearl Harbor, the U.S. declared war on Japan and its Allies.
Did the United States declare war on Germany first, yes or no?
December 11, 1941, only hours after Germany declared war on the US.
Germany and Italy declared war on the US on Dec. 11, 1941, after the US declared war on Japan because of the bombing of Pearl Harbor on the 7th. The United States responded by declaring war on Germany and Italy later that same day.
Because the United States had declared war on Japan, Germany's ally.
There were several events leading up to the US declaring war on Germany in World War I. These included the Zimmermann Telegram, Germany's announcement of a policy of unrestricted submarine warfare, and the sinking of several American ships.
The US did not declare war on Germany first. Germany declared war on the US shortly after the Japanese attack on Hawaii.
Germany
1917
In 1941
Besides the obvious, because Germany declared war on the US hours before.
The immediate cause was Japan's attack on Pearl Harbor. This caused the US to declare war on Japan, which led to Germany and Italy declaring war on the US.
December 11, 1941, immediately after Germany declared war on the US.
Germany declared war on the US after Japan carried out its attack on Pearl Harbor.
That's what I want to know!